
Category Archives: Machine Learning

Global Machine Learning As A Service (MLaaS) Market: Industry Analysis And Forecast (2020-2027) – MR Invasion

The Global Machine Learning as a Service (MLaaS) Market was valued at approximately US$ XX Bn in 2019 and is expected to grow at a CAGR of 41.7% over the forecast period, reaching US$ 11.3 Bn in 2027.

The study has analyzed the revenue impact of the COVID-19 pandemic on the sales of market leaders, market followers, and disrupters, and the same is reflected in our analysis.

REQUEST FOR FREE SAMPLE REPORT: https://www.maximizemarketresearch.com/request-sample/55511/

Market Definition:

Machine learning as a service (MLaaS) is an array of services that offer ML tools as part of cloud computing services. MLaaS helps clients benefit from machine learning without the associated cost, time, and risk of establishing an in-house machine learning team.

Machine Learning Service Providers:

Global Machine Learning as a Service (MLaaS) Market

Market Dynamics:

The scope of the report includes a detailed study of global and regional markets for the Global Machine Learning as a Service (MLaaS) Market, with analysis of variations in the growth of the industry in each region. Large enterprises and SMEs are focusing on customer experience management to maintain a complete and robust relationship with their customers by using customer data, so ML needs to be integrated into enterprise applications to control and make optimal use of this data. Retail enterprises are shifting their focus to customer buying patterns with the rising number of e-commerce websites and the digital revolution in the retail industry. This drives the need to track and manage the inventory movement of items, which can be done using MLaaS. The use of MLaaS by retail enterprises for inventory optimization and behavioral tracking is expected to have a positive impact on global market growth.

Apart from this, the growing trend of digitization is driving the growth of the MLaaS market globally, and growing adoption of cloud-based platforms is expected to have a further positive impact. However, a lack of qualified and skilled personnel is believed to be one of the challenges to the growth of the MLaaS market, and increasing concern over data privacy is anticipated to restrain the development of the global market.

Market Segmentation:

The report will provide an accurate prediction of the contribution of the various segments to the growth of the Machine Learning as a Service (MLaaS) Market. Based on organization size, the SMEs segment is expected to account for the largest XX% market share by 2027, as SMEs are increasingly projected to adopt machine learning services. With the help of predictive analytics, ML algorithms not only provide real-time data but also predict future trends. SMEs use machine learning solutions to fine-tune their supply chains by predicting demand for a product and by suggesting the timing and quantity of supplies vital to satisfying customers' expectations.

Regional Analysis:

The report offers a brief analysis of the major regions in the MLaaS market, namely Asia-Pacific, Europe, North America, South America, and the Middle East & Africa. North America plays an important role in the MLaaS market, with a market size of US$ XX Mn in 2019 that will reach US$ XX Mn in 2027 at a CAGR of XX%, followed by Europe. Most machine learning as a service companies are based in the U.S. and are contributing significantly to the growth of the market. Asia-Pacific has been growing at the highest rate because of rising investment, favorable government policies, and growing awareness. In 2017, Google launched Google Neural Machine Translation for 9 Indian languages, which uses ML and artificial neural networks to improve the fluency and accuracy of Google Translate.

Recent Development:

The MMR research study includes the profiles of leading companies operating in the Global Machine Learning as a Service (MLaaS) Market. Companies in the global market are focused on enhancing their products and services through various strategic approaches. ML providers are competing by launching new product categories with advanced subscription-based platforms, and have adopted strategies of version upgrades, mergers and acquisitions, agreements, partnerships, and strategic collaborations with regional and global players to achieve high growth in the MLaaS market.

For example, in April 2019, Microsoft developed a platform that uses machine teaching to help deep reinforcement learning algorithms tackle real-world problems. Microsoft scientists and product developers pioneered this complementary approach, which relies on people's knowledge of how to break a problem into easier tasks, giving ML models important clues about how to find a solution sooner.

DO INQUIRY BEFORE PURCHASING REPORT HERE: https://www.maximizemarketresearch.com/inquiry-before-buying/55511/

The objective of the report is to present a comprehensive analysis of the Global Machine Learning as a Service (MLaaS) Market, including all the stakeholders of the industry. The past and current status of the industry, with forecasted market size and trends, is presented in the report with analysis of complicated data in simple language. The report covers all aspects of the industry with a dedicated study of key players, including market leaders, followers, and new entrants, by region. PORTER, SWOT, and PESTEL analyses with the potential impact of micro-economic factors by region have been presented in the report. External as well as internal factors that are expected to affect the business positively or negatively have been analyzed, giving decision-makers a clear futuristic view of the industry.

The report also helps in understanding the dynamics and structure of the Global Machine Learning as a Service (MLaaS) Market by analyzing the market segments, and it projects the market size. A clear representation of competitive analysis of key players by application, price, financial position, product portfolio, growth strategies, and regional presence makes the report an investor's guide.

Scope of the Global Machine Learning as a Service (MLaaS) Market

Global Machine Learning as a Service (MLaaS) Market, By Component

Software, Services

Global Machine Learning as a Service (MLaaS) Market, By Organization Size

Large Enterprises, SMEs

Global Machine Learning as a Service (MLaaS) Market, By End-Use Industry

Aerospace & Defense, IT & Telecom, Energy & Utilities, Public Sector, Manufacturing, BFSI, Healthcare, Retail, Others

Global Machine Learning as a Service (MLaaS) Market, By Application

Marketing & Advertising, Fraud Detection & Risk Management, Predictive Analytics, Augmented & Virtual Reality, Natural Language Processing, Computer Vision, Security & Surveillance, Others

Global Machine Learning as a Service (MLaaS) Market, By Region

Asia Pacific, North America, Europe, Latin America, Middle East & Africa

Key players operating in the Global Machine Learning as a Service (MLaaS) Market

Ersatz Labs, Inc.; BigML; Yottamine Analytics; Hewlett Packard; Amazon Web Services; IBM; Microsoft; Sift Science, Inc.; Google; AT&T; Fuzzy.ai; SAS Institute Inc.; FICO; Predictron Labs Ltd.

MAJOR TOC OF THE REPORT

Chapter One: Machine Learning as a Service Market Overview

Chapter Two: Manufacturers Profiles

Chapter Three: Global Machine Learning as a Service Market Competition, by Players

Chapter Four: Global Machine Learning as a Service Market Size by Regions

Chapter Five: North America Machine Learning as a Service Revenue by Countries

Chapter Six: Europe Machine Learning as a Service Revenue by Countries

Chapter Seven: Asia-Pacific Machine Learning as a Service Revenue by Countries

Chapter Eight: South America Machine Learning as a Service Revenue by Countries

Chapter Nine: Middle East and Africa Revenue Machine Learning as a Service by Countries

Chapter Ten: Global Machine Learning as a Service Market Segment by Type

Chapter Eleven: Global Machine Learning as a Service Market Segment by Application

Chapter Twelve: Global Machine Learning as a Service Market Size Forecast (2019-2026)

Browse Full Report with Facts and Figures of Machine Learning as a Service Market Report at: https://www.maximizemarketresearch.com/market-report/global-machine-learning-as-a-service-mlaas-market/55511/

About Us:

Maximize Market Research provides B2B and B2C market research on 20,000 high growth emerging technologies & opportunities in Chemical, Healthcare, Pharmaceuticals, Electronics & Communications, Internet of Things, Food and Beverages, Aerospace and Defense and other manufacturing sectors.

Contact info:

Name: Vikas Godage

Organization: MAXIMIZE MARKET RESEARCH PVT. LTD.

Email: sales@maximizemarketresearch.com

Contact: +919607065656/ +919607195908

Website: http://www.maximizemarketresearch.com


Is Machine Learning Model Management The Next Big Thing In 2020? – Analytics India Magazine

ML and its services are only going to extend their influence and push the boundaries to new realms of the technology revolution. However, deploying ML comes with great responsibility. Though efforts are being made to shed its black-box reputation, it is crucial to establish trust among both in-house teams and stakeholders for a fairer deployment. Companies have started to take machine learning model management more seriously. Recently, Comet.ml, a machine learning company based out of Seattle and founded in 2017, announced a $4.5 million investment to bring state-of-the-art meta-learning capabilities to the market.

The tools developed by Comet.ml enable data scientists to track, compare, monitor, and optimise model development. The additional $4.5 million investment from existing investors Trilogy Equity Partners and Two Sigma Ventures is aimed at bringing machine learning model management techniques to more customers.

Since its product launch in 2018, Comet.ml has partnered with top companies like Google, General Electric, Boeing, and Uber. This elite list of customers uses Comet.ml's enterprise-level toolkits to train models across multiple industries spanning autonomous vehicles, financial services, technology, bioinformatics, satellite imagery, fundamental physics research, and more.

Talking about the announcement, one of the investors, Yuval Neeman of Trilogy Equity Partners, noted that professionals from the best companies in the world choose Comet, and that the company is well positioned to become the de facto machine learning development platform.

This platform, says Neeman, allows customers to build ML models that bring significant business value.

According to a report presented by researchers at Google, there are several ML-specific risk factors to account for in system design.

Debugging all these issues requires round-the-clock monitoring of the model's pipeline. For a company that implements ML solutions, managing in-house model mishaps is challenging.

Taking the example of Comet again, its platform provides a central place for teams to track their ML experiments and models, so that they can compare and share experiments, debug, and take decisive action on underperforming models with ease.

Predictive early stopping is a meta-learning capability not seen in other experimentation platforms, and it can be achieved only by building on top of millions of public models. This is where Comet's enterprise products come in handy. The freedom of experimentation that these meta-learning platforms offer is something any organisation would welcome, and almost all ML-based companies would love to have such tools in their arsenal.
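To make the idea concrete, here is a deliberately crude sketch of predictive early stopping: extrapolate a run's partial learning curve and stop it if the predicted final loss cannot beat the best completed run. The geometric-decay curve model and all numbers below are invented; Comet's actual system is built on models learned from millions of public experiments.

```python
def predict_final_loss(losses, total_epochs):
    """Extrapolate a loss curve assuming per-epoch improvements decay geometrically."""
    imp_prev = losses[-3] - losses[-2]          # improvement two epochs ago
    imp_last = losses[-2] - losses[-1]          # most recent improvement
    r = imp_last / imp_prev if imp_prev > 0 else 0.0
    r = max(0.0, min(r, 0.99))                  # clamp the decay ratio
    remaining = total_epochs - len(losses)
    # geometric sum of the projected future improvements
    future_gain = imp_last * (r * (1 - r ** remaining) / (1 - r)) if r > 0 else 0.0
    return losses[-1] - future_gain

def should_stop(losses, total_epochs, best_final_loss):
    """Stop if the extrapolated final loss cannot beat the best run so far."""
    return predict_final_loss(losses, total_epochs) > best_final_loss

best = 0.35                               # best completed run so far (invented)
stalled = [1.0, 0.95, 0.93, 0.92]         # improvements shrinking fast
promising = [1.0, 0.70, 0.52, 0.41]       # still improving strongly

print(should_stop(stalled, 20, best))     # True: predicted final loss is ~0.91
print(should_stop(promising, 20, best))   # False: predicted final loss beats 0.35
```

A real implementation would fit a richer curve model (and quantify its uncertainty), but the stop/continue decision has this same shape.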

On saving resources, Comet.ml stated in its press release that its platform improved model training time by 30% irrespective of the underlying infrastructure, and that it stops underperforming models automatically, reducing cost and carbon footprint by 30%.

The enterprise offering also includes Comet's flagship visualisation engine, which allows users to visualise, explain, and debug model performance and predictions, and a state-of-the-art parameter optimisation engine.

When building any machine learning pipeline, data preparation requires operations like scraping, sampling, joining, and plenty of other approaches. These operations usually accumulate haphazardly and result in what software engineers like to call a "pipeline jungle."

Now, add in the challenge of forgotten experimental code in the code archives, and things only get worse. Such stale code can malfunction, and an algorithm that runs malfunctioning code can crash stock markets and self-driving cars. The risks are just too high.

So far, we have seen the use of ML for data-driven solutions. Now the market is ripe for solutions that help those who have already deployed machine learning. It is only a matter of time before we see more companies setting up their own meta-learning shops or partnering with third-party vendors.


Could Machine Learning Replace the Entire Weather Forecast System? – HPCwire

Just a few months ago, a series of major new weather and climate supercomputing investments were announced, including a £1.2 billion order for the world's most powerful weather and climate supercomputer and a tripling of the U.S. operational supercomputing capacity for weather forecasting. Weather and climate modeling are among the most power-hungry use cases for supercomputers, and research and forecasting agencies often struggle to keep up with the computing needs of models that are, in many cases, simulating the atmosphere of the entire planet as granularly and as regularly as possible.

What if that all changed?

In a virtual keynote for the HPC-AI Advisory Council's 2020 Stanford Conference, Peter Dueben outlined how machine learning might (or might not) begin to augment and even, eventually, compete with heavy-duty, supercomputer-powered climate models. Dueben is the coordinator for machine learning and AI activities at the European Centre for Medium-Range Weather Forecasts (ECMWF), a UK-based intergovernmental organization that houses two supercomputers and provides 24/7 operational weather services at several timescales. ECMWF is also the home of the Integrated Forecast System (IFS), which Dueben says is "probably one of the best forecast models in the world."

Why machine learning at all?

The Earth, Dueben explained, is big. So big, in fact, that apart from being laborious, developing a representational model of the Earth's weather and climate systems brick by brick isn't achieving the accuracy that you might imagine. Despite the computing firepower behind weather forecasting, most models remain at a 10-kilometer resolution that doesn't represent clouds, and the chaotic atmospheric dynamics and occasionally opaque interactions further complicate model outputs.

"However, on the other side, we have a huge number of observations," Dueben said. "Just to give you an impression, ECMWF is getting hundreds of millions of observations onto the site every day." Some observations come from satellites, planes, ships, ground measurements, and balloons. This data, collected over the last several decades, constitutes hundreds of petabytes if simulations and climate modeling results are included.

"If you combine those two points, we have a very complex nonlinear system and we also have a lot of data," he said. "There's obviously lots of potential applications for machine learning in weather modeling."

Potential applications of machine learning

Machine learning applications are really spread all over the entire workflow of weather prediction, Dueben said, breaking that workflow down into observations, data assimilation, numerical weather forecasting, and post-processing and dissemination. Across those areas, he explained, machine learning could be used for anything from weather data monitoring to learning the underlying equations of atmospheric motions.

By way of example, Dueben highlighted a handful of current, real-world applications. In one case, researchers had applied machine learning to detecting wildfires caused by lightning. Using observations for 15 variables (such as temperature, soil moisture, and vegetation cover), the researchers constructed a machine learning-based decision tree to assess whether or not satellite observations included wildfires. The team achieved an accuracy of 77 percent, which, Dueben said, doesn't sound too great in principle but was actually quite good.
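The approach can be sketched in a few lines. The features, thresholds, and toy samples below are invented for illustration; the actual study used 15 observed variables and learned its decision splits from data.

```python
# A hand-written stand-in for a learned decision tree that decides
# whether a satellite scene contains a wildfire. All feature names,
# thresholds, and samples are illustrative, not from the study.

def is_wildfire(obs: dict) -> bool:
    if obs["soil_moisture"] > 0.4:       # wet ground: unlikely to burn
        return False
    if obs["temperature_c"] < 25:        # cool scene: unlikely to burn
        return False
    # hot, dry scene: decide on vegetation available as fuel
    return obs["vegetation_cover"] > 0.3

# Toy labeled observations (values and labels are made up).
samples = [
    ({"temperature_c": 38, "soil_moisture": 0.1, "vegetation_cover": 0.7}, True),
    ({"temperature_c": 12, "soil_moisture": 0.6, "vegetation_cover": 0.8}, False),
    ({"temperature_c": 30, "soil_moisture": 0.2, "vegetation_cover": 0.1}, False),
    ({"temperature_c": 35, "soil_moisture": 0.3, "vegetation_cover": 0.5}, True),
]

accuracy = sum(is_wildfire(x) == y for x, y in samples) / len(samples)
print(f"accuracy: {accuracy:.0%}")   # accuracy: 100%
```

In the real system the tree structure and thresholds are fitted to labeled data rather than written by hand, which is how the reported 77 percent accuracy was measured.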

Elsewhere, another team explored the use of machine learning to correct persistent biases in forecast model results. Dueben explained that researchers were examining the use of a weak-constraint algorithm (in this case, 4D-Var), a kind of algorithm able to learn this kind of forecast error and correct it in the data assimilation process.

"We learn, basically, the bias," he said, "and then once we have learned the bias, we can correct the bias of the forecast model by just adding forcing terms to the system." Once 4D-Var was implemented on a sample of forecast model results, the biases were ameliorated. Though Dueben cautioned that the process is still fairly simplistic, a new collaboration with Nvidia is looking into more sophisticated ways of correcting those forecast errors with machine learning.

Dueben also outlined applications in post-processing. Much of modern weather forecasting focuses on ensemble methods, where a model is run many times to obtain a spread of possible scenarios and, as a result, probabilities of various outcomes. "We investigate whether we can correct the ensemble spread calculated from a small number of ensemble members via deep learning," Dueben said. Once again, machine learning, when applied to a ten-member ensemble looking at temperatures in Europe, improved the results, reducing error in temperature spreads.
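As a toy illustration of the spread-correction idea: a cheap small ensemble tends to be underdispersive, so its spread systematically underestimates that of a large reference ensemble, and a model can learn the mapping between the two. In the sketch below a linear least-squares fit stands in for the deep learning model, and all numbers are synthetic.

```python
import numpy as np

# Synthetic experiment: small-ensemble spread is a biased, noisy
# estimate of the reference spread; learn a linear correction.
rng = np.random.default_rng(0)
n_cases = 500
true_spread = rng.uniform(0.5, 3.0, size=n_cases)                  # reference ("large ensemble") spread
small_spread = 0.7 * true_spread + rng.normal(0.0, 0.1, n_cases)   # underdispersive estimate

# Fit corrected = a * small_spread + b by least squares.
A = np.stack([small_spread, np.ones(n_cases)], axis=1)
(a, b), *_ = np.linalg.lstsq(A, true_spread, rcond=None)
corrected = a * small_spread + b

err_before = np.abs(small_spread - true_spread).mean()
err_after = np.abs(corrected - true_spread).mean()
print(f"mean abs spread error: {err_before:.3f} -> {err_after:.3f}")
```

The learned slope comes out above 1, inflating the underdispersive estimate; the actual work replaced this linear map with a deep network conditioned on the forecast situation.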

Can machine learning replace core functionality or even the entire forecast system?

"One of the things that we're looking into is the emulation of different parameterization schemes," Dueben said. Chief among those, at least initially, has been the radiation component of forecast models, which accounts for the fluxes of solar radiation between the ground, the clouds, and the upper atmosphere. As a trial run, Dueben and his colleagues are using extensive radiation output data from a forecast model to train a neural network. "First of all, it's very, very light," Dueben said. "Second of all, it's also going to be much more portable. Once we represent radiation with a deep neural network, you can basically port it to whatever hardware you want."

Showing a pair of output images, one from the machine learning model and one from the forecast model, Dueben pointed out that it was hard to notice significant differences, and he even refused to tell the audience which was which. Furthermore, he said, the model had achieved around a tenfold speedup. ("I'm quite confident that it will actually be much better than a factor of ten," Dueben said.)
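A minimal sketch of this kind of emulation: train a small neural network on input/output pairs of the scheme it should replace. Here an invented smooth function plays the role of the radiation code; the real emulator maps full atmospheric columns to radiative fluxes and is far larger.

```python
import numpy as np

rng = np.random.default_rng(1)

def radiation_scheme(x):
    """Stand-in for the expensive physics code (invented function)."""
    return np.tanh(1.5 * x) + 0.2 * x

# Generate training pairs from the "physics" scheme.
X = rng.uniform(-2.0, 2.0, size=(256, 1))
Y = radiation_scheme(X)

# One hidden layer of 32 tanh units, plain full-batch gradient descent.
W1 = rng.normal(0.0, 0.5, (1, 32)); b1 = np.zeros(32)
W2 = rng.normal(0.0, 0.5, (32, 1)); b2 = np.zeros(1)
lr = 0.1
for _ in range(3000):
    H = np.tanh(X @ W1 + b1)               # forward pass
    pred = H @ W2 + b2
    err = (pred - Y) / len(X)              # gradient of 0.5 * mean squared error
    gW2 = H.T @ err; gb2 = err.sum(axis=0)
    dH = (err @ W2.T) * (1.0 - H ** 2)     # backprop through tanh
    gW1 = X.T @ dH; gb1 = dH.sum(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

mse = float(((np.tanh(X @ W1 + b1) @ W2 + b2 - Y) ** 2).mean())
print(f"emulator MSE: {mse:.4f}")
```

Once trained, the emulator is just a few matrix multiplications, which is why it is light, fast, and portable to whatever hardware runs them.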

Dueben and his colleagues have also scaled their tests up to more ambitious realms. They pulled hourly data on geopotential height (Z500), which is related to air pressure, and trained a deep learning model to predict future changes in Z500 across the globe using only that historical data. "For this, no physical understanding is really required," Dueben said, "and it turns out that it's actually working quite well."

Still, Dueben forced himself to face the crucial question.

"Is this the future?" he asked. "I have to say it's probably not."

There were several reasons for this. First, Dueben said, the simulations were unstable, eventually blowing up if they were stretched too far. Second, he said, it's also unknown how to increase complexity at this stage: "We only have one field here." Finally, he explained, there were only forty years of sufficiently detailed data with which to work.

Still, it wasn't all pessimism. "It's kind of unlikely that it's going to fly and basically feed operational forecasting at one point," he said. "However, having said this, there are now a number of papers coming out where people are looking into this in a much, much more complicated way than we have done, with really sophisticated convolutional networks, and they get, actually, quite good results. So who knows!"

The path forward

"The main challenge for machine learning in the community that we're facing at the moment," Dueben said, "is basically that we need to prove now that machine learning solutions can really be better than conventional tools, and we need to do this in the next couple of years."

There are, of course, many roadblocks to that goal. Forecasting models are extraordinarily complicated; iterations on deep learning models require significant HPC resources to test and validate; and metrics of comparison among models are unclear. Dueben also outlined a series of major unknowns in machine learning for weather forecasting: could our explicit knowledge of atmospheric mechanisms be used to improve a machine learning forecast? Could researchers guarantee reproducibility? Could the tools be scaled effectively to HPC? The list went on.

"Many scientists are working on these dilemmas as we speak," Dueben said, "and I'm sure we will have an enormous amount of progress in the next couple of years." Outlining a path forward, Dueben emphasized a mixture of top-down and bottom-up approaches to link machine learning with weather and climate models. Per his diagram, this would combine neural networks based on human knowledge of Earth systems with reliable benchmarks, scalability, and better uncertainty quantification.

As far as where he sees machine learning for weather prediction in ten years?

"It could be that machine learning will have no long-term effect whatsoever, that it's just a wave going through," Dueben mused. "But on the other hand, it could well be that machine learning tools will actually replace almost all conventional models that we're working with."


Harnessing the power of GaN and machine learning – News – Compound Semiconductor

Military installations, especially on ships and aircraft, require robust power electronics systems to operate radar and other equipment, but there is limited space onboard. Researchers from the University of Houston will use a $2.5 million grant from the US Department of Defense to develop compact electronic power systems to address the issue.

Harish Krishnamoorthy, assistant professor of electrical and computer engineering and principal investigator for the project, said he will focus on developing power converters using gallium nitride (GaN) devices, capable of quickly storing and discharging energy to operate the radar systems.

He is working with co-PI Kaushik Rajashekara, professor of electrical and computer engineering, and Tagore Technology, a semiconductor company based in Arlington Heights, Ill. The work has potential commercial applications, in addition to military use, he said.

Currently, radar systems require large capacitors, which store energy and provide bursts of power to operate the systems. The electrolytic capacitors also have relatively short lifespans, Krishnamoorthy said.

GaN devices can be turned on and off far more quickly - over ten times as quickly as silicon devices. The resulting higher operating frequency allows passive components in the circuit - including capacitors and inductors - to be designed at much smaller dimensions.

But there are still drawbacks to GaN devices. Noise - electromagnetic interference, or EMI - can affect the precision of radar systems, since the devices work at such high speeds. Part of Krishnamoorthy's project involves designing a system where converters can contain the noise, allowing the radar system to operate unimpeded.

He also will use machine learning to predict the lifespan of GaN devices, as well as of circuits employing these devices. The use of GaN technology in power applications is relatively new, and assessing how long they will continue to operate in a circuit remains a challenge.

"We don't know how long these GaN devices will last in practical applications, because they've only been used for a few years," Krishnamoorthy said. "That's a concern for industry."



This AI tool uses machine learning to detect whether people are social distancing properly – Mashable SE Asia

Perhaps the most important step we can all take to mitigate the spread of the coronavirus, also known as COVID-19, is to actively practice social distancing.

Why? Because the further away you are from another person, the less likely you'll contract or transmit COVID-19.

But when we go about our daily routines, especially when out on a grocery run or heading to the hospital, social distancing can be a challenging task to uphold.

And some of us just have God awful spatial awareness in general.

But how do we monitor and enforce social distancing when looking at a mass population? We resort to the wonders of artificial intelligence (AI), of course.

In a recent blog post, AI startup Landing AI demonstrated a nifty social distancing detector that shows a feed of people walking along a street in the Oxford Town Center in the United Kingdom.

The tool encompasses every individual in the feed with a rectangle. When they're properly observing social distancing, that rectangle is green. But when they get too close to another person (less than 6 feet away), the rectangle turns red, accompanied by a line 'linking' the two people that are too close to one another.

On the right-hand side of the tool there's a 'Bird's-Eye View' that allows for monitoring on a bigger scale. Every person is represented by a dot. Working the same way as the rectangles, the dots are green when social distancing is properly adhered to. They turn red when people get too close.
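The distance check at the core of such a detector is straightforward once people have been detected and mapped to ground-plane coordinates. The sketch below assumes that mapping has already been done; the person IDs, coordinates, and helper names are illustrative, not Landing AI's actual code.

```python
import math
from itertools import combinations

SAFE_DISTANCE_FT = 6.0

def flag_violations(positions):
    """positions: {person_id: (x_ft, y_ft)} -> (violating pairs, colour per person)."""
    pairs = []
    for a, b in combinations(positions, 2):        # every pair of detected people
        (xa, ya), (xb, yb) = positions[a], positions[b]
        if math.hypot(xa - xb, ya - yb) < SAFE_DISTANCE_FT:
            pairs.append((a, b))                   # too close: draw a linking line
    too_close = {p for pair in pairs for p in pair}
    colours = {p: ("red" if p in too_close else "green") for p in positions}
    return pairs, colours

# Invented ground-plane positions, in feet.
people = {"A": (0.0, 0.0), "B": (4.0, 0.0), "C": (20.0, 5.0)}
pairs, colours = flag_violations(people)
print(pairs)    # [('A', 'B')]
print(colours)  # {'A': 'red', 'B': 'red', 'C': 'green'}
```

The full system adds a person detector in front of this check and a perspective transform to convert image-space bounding boxes into the bird's-eye-view coordinates shown in the demo.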

The tool is aimed at work settings, more specifically settings like factory floors, where physical space is abundant and manual tracking is extremely difficult.

According to Landing AI CEO and founder Andrew Ng, the technology was developed in response to requests from clients, which include Foxconn, the main manufacturer of Apple's prized iPhones.

The company also says that this technology can be integrated into existing surveillance cameras. However, it's still exploring ways in which to alert people when they get too close to each other. One possible method is the use of an audible alarm that rings when individuals breach the minimum distance required with other people.

According to Reuters, Amazon already uses a similar machine-learning tool to monitor its employees in their warehouses. In the name of COVID-19 mitigation, companies around the world are grabbing whatever machine-learning AI tools they can get in order to surveil their employees. A lot of these tools tend to be cheap, off-the-shelf iterations that allow employers to watch their employees and listen to phone calls as well.

Landing AI insists that their tool is only for use in work settings, even including a little disclaimer that reads "The rise of computer vision has opened up important questions about privacy and individual rights; our current system does not recognize individuals, and we urge anyone using such a system to do so with transparency and only with informed consent."

Whether companies that make use of this tool adhere to that, we'll never really know.

But we definitely don't want Big Brother to be watching our every move.

Cover image sourced from New Straits Times / AFP.


Yoshua Bengio: Attention is a core ingredient of conscious AI – VentureBeat

During the International Conference on Learning Representations (ICLR) 2020 this week, which as a result of the pandemic took place virtually, Turing Award winner and director of the Montreal Institute for Learning Algorithms Yoshua Bengio provided a glimpse into the future of AI and machine learning techniques. He spoke in February at the AAAI Conference on Artificial Intelligence 2020 in New York alongside fellow Turing Award recipients Geoffrey Hinton and Yann LeCun. But in a lecture published Monday, Bengio expounded upon some of his earlier themes.

One of those was attention: in this context, the mechanism by which a person (or algorithm) focuses on a single element or a few elements at a time. It's central both to machine learning model architectures like Google's Transformer and to the bottleneck neuroscientific theory of consciousness, which suggests that people have limited attention resources, so information is distilled down in the brain to only its salient bits. Models with attention have already achieved state-of-the-art results in domains like natural language processing, and they could form the foundation of enterprise AI that assists employees in a range of cognitively demanding tasks.
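In its most common form this is scaled dot-product attention, which can be sketched in a few lines of NumPy: each query distributes a softmax-normalized budget of weight over the available elements, so a few salient ones dominate the output. The shapes and values below are illustrative.

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention. Q:(n,d), K:(m,d), V:(m,dv) -> (n,dv)."""
    scores = Q @ K.T / np.sqrt(Q.shape[-1])          # similarity of each query to each key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax: weights sum to 1 per query
    return weights @ V, weights                      # weighted summary of V, plus the weights

rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))   # 2 queries of dimension 4
K = rng.normal(size=(5, 4))   # 5 keys
V = rng.normal(size=(5, 3))   # 5 values of dimension 3
out, w = attention(Q, K, V)
print(out.shape, w.sum(axis=-1))   # (2, 3) [1. 1.]
```

The softmax is the "limited budget": sharpening it concentrates each query's weight on fewer elements, which is the bottleneck behavior Bengio connects to conscious processing.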

Bengio described the cognitive systems proposed by Israeli-American psychologist and economist Daniel Kahneman in his seminal book Thinking, Fast and Slow. The first type is unconscious: it's intuitive and fast, non-linguistic and habitual, and it deals only with implicit types of knowledge. The second is conscious: it's linguistic and algorithmic, and it incorporates reasoning and planning, as well as explicit forms of knowledge. An interesting property of the conscious system is that it allows the manipulation of semantic concepts that can be recombined in novel situations, which Bengio noted is a desirable property in AI and machine learning algorithms.

Current machine learning approaches have yet to move beyond the unconscious to the fully conscious, but Bengio believes this transition is well within the realm of possibility. He pointed out that neuroscience research has revealed that the semantic variables involved in conscious thought are often causal: they involve things like intentions or controllable objects. It's also now understood that a mapping between semantic variables and thoughts exists, like the relationship between words and sentences, for example, and that concepts can be recombined to form new and unfamiliar concepts.


Attention is one of the core ingredients in this process, Bengio explained.

Building on this, in a recent paper he and colleagues proposed recurrent independent mechanisms (RIMs), a new model architecture in which multiple groups of cells operate independently, communicating only sparingly through attention. They showed that this leads to specialization among the RIMs, which in turn allows for improved generalization on tasks where some factors of variation differ between training and evaluation.

This allows an agent to adapt faster to changes in a distribution or inference in order to discover reasons why the change happened, said Bengio.

He outlined a few of the outstanding challenges on the road to conscious systems, including identifying ways to teach models to meta-learn (or understand causal relations embodied in data) and tightening the integration between machine learning and reinforcement learning. But he's confident that the interplay between biological and AI research will eventually unlock the key to machines that can reason like humans and even express emotions.

"Consciousness has been studied in neuroscience with a lot of progress in the last couple of decades. I think it's time for machine learning to consider these advances and incorporate them into machine learning models."
