
Category Archives: Machine Learning

Elon Musk-backed OpenAI to release text tool it called dangerous – The Guardian

OpenAI, the machine learning nonprofit co-founded by Elon Musk, has released its first commercial product: a rentable version of a text generation tool the organisation once deemed too dangerous to release.

Dubbed simply "the API", the new service lets businesses directly access the most powerful version of GPT-3, OpenAI's general purpose text generation AI.

The tool is already a more than capable writer. Fed the opening line of George Orwell's Nineteen Eighty-Four ("It was a bright cold day in April, and the clocks were striking thirteen"), an earlier version of the system recognised the vaguely futuristic tone and the novelistic style, and continued with: "I was in my car on my way to a new job in Seattle. I put the gas in, put the key in, and then I let it run. I just imagined what the day would be like. A hundred years from now. In 2045, I was a teacher in some school in a poor part of rural China. I started with Chinese history and history of science."

Now, OpenAI wants to put the same power to more commercial uses such as coding and data entry. For instance, if, rather than Orwell, the prompt is a list of the names of six companies and the stock tickers and foundation dates of two of them, the system will finish it by filling in the missing details for the other companies.
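
The article describes this workflow only in prose, but the underlying pattern is straightforward: send the model a text prompt, receive a machine-written continuation. Below is a minimal sketch of such a call; the endpoint URL, parameter names and response shape are illustrative assumptions for this sketch, not OpenAI's documented interface.

```python
import requests

API_URL = "https://api.example.com/v1/completions"  # hypothetical endpoint, illustration only
API_KEY = "YOUR_API_KEY"  # placeholder credential

def complete(prompt: str, max_tokens: int = 64) -> str:
    """Send a prompt to a hypothetical completion API and return the generated text."""
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"prompt": prompt, "max_tokens": max_tokens},  # assumed request fields
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["text"]  # assumed response field

# The pattern-completion use-case from the article: give two complete rows
# and let the model fill in the missing details for the remaining company.
prompt = (
    "Company: Alphabet | Ticker: GOOGL | Founded: 1998\n"
    "Company: Microsoft | Ticker: MSFT | Founded: 1975\n"
    "Company: Tesla | Ticker:"
)
print(complete(prompt))
```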

It will mark the first commercial uses of a technology which stunned the industry in February 2019 when OpenAI first revealed its progress in teaching a computer to read and write. The group was so impressed by the capability of its new creation that it was initially wary of publishing the full version, warning that it could be misused for ends the nonprofit had not foreseen.

"We need to perform experimentation to find out what they can and can't do," said Jack Clark, the group's head of policy, at the time. "If you can't anticipate all the abilities of a model, you have to prod it to see what it can do. There are many more people than us who are better at thinking what it can do maliciously."

Now, that fear has lessened somewhat, with GPT-2 having been available to the public for almost a year. Still, the company says: "The field's pace of progress means that there are frequently surprising new applications of AI, both positive and negative.

"We will terminate API access for obviously harmful use-cases, such as harassment, spam, radicalisation, or astroturfing [masking who is behind a message]. But we also know we can't anticipate all of the possible consequences of this technology, so we are launching today in a private beta [test version] rather than general availability."

OpenAI was founded with a $1bn (£0.8bn) endowment in 2015, backed by Musk and others, to "advance digital intelligence in the way that is most likely to benefit humanity". Musk has since left the board, but remains a donor.

Follow this link:
Elon Musk-backed OpenAI to release text tool it called dangerous - The Guardian

Posted in Machine Learning | Comments Off on Elon Musk-backed OpenAI to release text tool it called dangerous – The Guardian

The cost of training machines is becoming a problem – The Economist

Jun 11th 2020

THE FUNDAMENTAL assumption of the computing industry is that number-crunching gets cheaper all the time. Moore's law, the industry's master metronome, predicts that the number of components that can be squeezed onto a microchip of a given size (and thus, loosely, the amount of computational power available at a given cost) doubles every two years.
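
Taken literally, "doubles every two years" means the growth factor over n years is 2^(n/2). A two-line illustration of that arithmetic (not a claim from the article):

```python
def doubling_factor(years: float, period_years: float = 2.0) -> float:
    """Growth factor implied by doubling once every fixed period (Moore's law style)."""
    return 2 ** (years / period_years)

print(doubling_factor(10))  # 32.0: a decade of Moore's law is roughly a 32-fold gain
```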

"For many comparatively simple AI applications, that means that the cost of training a computer is falling," says Christopher Manning, the director of Stanford University's AI Lab. But that is not true everywhere. A combination of ballooning complexity and competition means costs at the cutting edge are rising sharply.

Dr Manning gives the example of BERT, an AI language model built by Google in 2018 and used in the firm's search engine. It had more than 350m internal parameters and a prodigious appetite for data: it was trained using 3.3bn words of text culled mostly from Wikipedia, an online encyclopedia. These days, says Dr Manning, Wikipedia is not such a large data-set: "If you can train a system on 30bn words it's going to perform better than one trained on 3bn." And more data means more computing power to crunch it all.

OpenAI, a research firm based in California, says demand for processing power took off in 2012, as excitement around machine learning was starting to build. It has accelerated sharply: by 2018, the computer power used to train big models had risen 300,000-fold, and was doubling every three and a half months. It should know: to train its own OpenAI Five system, designed to beat humans at "Defense of the Ancients 2", a popular video game, it scaled machine learning to unprecedented levels, running thousands of chips non-stop for more than ten months.
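
Those two figures, a 300,000-fold rise and a doubling every three and a half months, can be cross-checked with a few lines of arithmetic. This is only an illustrative sanity check, not a calculation from the article:

```python
import math

fold_increase = 300_000
doubling_period_months = 3.5

doublings = math.log2(fold_increase)          # ~18.2 doublings to reach 300,000x
months = doublings * doubling_period_months   # ~64 months of sustained doubling
print(f"{doublings:.1f} doublings, roughly {months / 12:.1f} years")
# ~5.3 years, broadly consistent with the 2012-2018 window the article describes
```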

Exact figures on how much this all costs are scarce. But a paper published in 2019 by researchers at the University of Massachusetts Amherst estimated that training one version of Transformer, another big language model, could cost as much as $3m. Jerome Pesenti, Facebook's head of AI, says that one round of training for the biggest models can cost "millions of dollars" in electricity consumption.

Facebook, which turned a profit of $18.5bn in 2019, can afford those bills. Those less flush with cash are feeling the pinch. Andreessen Horowitz, an influential American venture-capital firm, has pointed out that many AI startups rent their processing power from cloud-computing firms like Amazon and Microsoft. The resulting bills, sometimes 25% of revenue or more, are one reason, it says, that AI startups may make for less attractive investments than old-style software companies. In March Dr Manning's colleagues at Stanford, including Fei-Fei Li, an AI luminary, launched the National Research Cloud, a cloud-computing initiative to help American AI researchers keep up with spiralling bills.

The growing demand for computing power has fuelled a boom in chip design and specialised devices that can perform the calculations used in AI efficiently. The first wave of specialist chips were graphics processing units (GPUs), designed in the 1990s to boost video-game graphics. As luck would have it, GPUs are also fairly well-suited to the sort of mathematics found in AI.

Further specialisation is possible, and companies are piling in to provide it. In December, Intel, a giant chipmaker, bought Habana Labs, an Israeli firm, for $2bn. Graphcore, a British firm founded in 2016, was valued at $2bn in 2019. Incumbents such as Nvidia, the biggest GPU-maker, have reworked their designs to accommodate AI. Google has designed its own tensor-processing unit (TPU) chips in-house. Baidu, a Chinese tech giant, has done the same with its own Kunlun chips. Alfonso Marone at KPMG reckons the market for specialised AI chips is already worth around $10bn, and could reach $80bn by 2025.

"Computer architectures need to follow the structure of the data they're processing," says Nigel Toon, one of Graphcore's co-founders. The most basic feature of AI workloads is that they are "embarrassingly parallel", which means they can be cut into thousands of chunks which can all be worked on at the same time. Graphcore's chips, for instance, have more than 1,200 individual number-crunching cores, and can be linked together to provide still more power. Cerebras, a Californian startup, has taken an extreme approach. Chips are usually made in batches, with dozens or hundreds etched onto standard silicon wafers 300mm in diameter. Each of Cerebras's chips takes up an entire wafer by itself. That lets the firm cram 400,000 cores onto each.
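
"Embarrassingly parallel" simply means the work splits into independent chunks that need no communication with one another. A toy illustration of the idea on an ordinary multi-core CPU (real AI workloads shard tensor operations across accelerator cores, not Python lists):

```python
from multiprocessing import Pool

def crunch(chunk):
    """Stand-in for the per-chunk arithmetic an accelerator core would perform."""
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    # Cut the work into independent chunks; each can be worked on at the same time.
    chunks = [data[i:i + 100_000] for i in range(0, len(data), 100_000)]
    with Pool() as pool:
        results = pool.map(crunch, chunks)  # chunks run in parallel across cores
    print(sum(results))
```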

Other optimisations are important, too. Andrew Feldman, one of Cerebras's founders, points out that AI models spend a lot of their time multiplying numbers by zero. Since those calculations always yield zero, each one is unnecessary, and Cerebras's chips are designed to avoid performing them. Unlike many tasks, says Mr Toon at Graphcore, ultra-precise calculations are not needed in AI. That means chip designers can save energy by reducing the fidelity of the numbers their creations are juggling. (Exactly how fuzzy the calculations can get remains an open question.)
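
The zero-skipping idea can be sketched in a few lines: in a dot product, any term with a zero factor contributes nothing, so it never needs to be computed. Cerebras does this in hardware at the multiplier level; the Python below is only an illustration of the logic:

```python
def sparse_dot(a, b):
    """Dot product that skips multiplications guaranteed to produce zero."""
    return sum(x * y for x, y in zip(a, b) if x != 0 and y != 0)

a = [0.0, 2.0, 0.0, 0.0, 5.0]   # AI activations are often mostly zeros
b = [1.0, 3.0, 4.0, 0.0, 2.0]
print(sparse_dot(a, b))  # 16.0, computed with 2 multiplications instead of 5
```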

All that can add up to big gains. Mr Toon reckons that Graphcore's current chips are anywhere between ten and 50 times more efficient than GPUs. They have already found their way into specialised computers sold by Dell, as well as into Azure, Microsoft's cloud-computing service. Cerebras has delivered equipment to two big American government laboratories.

Moore's law isn't possible any more

Such innovations will be increasingly important, for the AI-fuelled explosion in demand for computer power comes just as Moore's law is running out of steam. Shrinking chips is getting harder, and the benefits of doing so are not what they were. Last year Jensen Huang, Nvidia's founder, opined bluntly that "Moore's law isn't possible any more".

Other researchers are therefore looking at more exotic ideas. One is quantum computing, which uses the counter-intuitive properties of quantum mechanics to provide big speed-ups for some sorts of computation. One way to think about machine learning is as an optimisation problem, in which a computer is trying to make trade-offs between millions of variables to arrive at a solution that minimises as many as possible. A quantum-computing technique called Grover's algorithm offers big potential speed-ups, says Krysta Svore, who leads the Quantum Architectures and Computation Group at Microsoft Research.
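
For unstructured search, Grover's algorithm is known to give a quadratic speed-up: finding a marked item among N candidates takes on the order of the square root of N quantum queries rather than N classical ones. A quick illustration of why that matters at scale (simple arithmetic, not a figure from the article):

```python
import math

n = 10**12                       # a trillion candidate solutions
classical_queries = n            # brute force examines on the order of N candidates
grover_queries = math.isqrt(n)   # Grover needs on the order of sqrt(N) queries
print(f"classical ~{classical_queries:.0e}, Grover ~{grover_queries:.0e}")
# classical ~1e+12, Grover ~1e+06: a million-fold reduction in queries
```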

Another idea is to take inspiration from biology, which proves that current brute-force approaches are not the only way. Cerebras's chips consume around 15kW when running flat-out, enough to power dozens of houses (an equivalent number of GPUs consumes many times more). A human brain, by contrast, uses about 20W of energy (about a thousandth as much) and is in many ways cleverer than its silicon counterpart. Firms such as Intel and IBM are therefore investigating "neuromorphic" chips, which contain components designed to mimic more closely the electrical behaviour of the neurons that make up biological brains.

For now, though, all that is far off. Quantum computers are relatively well-understood in theory, but despite billions of dollars in funding from tech giants such as Google, Microsoft and IBM, actually building them remains an engineering challenge. Neuromorphic chips have been built with existing technologies, but their designers are hamstrung by the fact that neuroscientists still do not understand what exactly brains do, or how they do it.

That means that, for the foreseeable future, AI researchers will have to squeeze every drop of performance from existing computing technologies. Mr Toon is bullish, arguing that there are plenty of gains to be had from more specialised hardware and from tweaking existing software to run faster. To quantify the nascent field's progress, he offers an analogy with video games: "We're past Pong," he says. "We're maybe at Pac-Man by now." All those without millions to spend will be hoping he is right.

This article appeared in the Technology Quarterly section of the print edition under the headline "Machine, learning"

Read more here:
The cost of training machines is becoming a problem - The Economist

Posted in Machine Learning | Comments Off on The cost of training machines is becoming a problem – The Economist

What Are DPUs And Why Do We Need Them – Analytics India Magazine

We have heard of CPUs and TPUs; now NVIDIA, with the help of its recent acquisition Mellanox, is bringing a new class of processors to power up deep learning applications: DPUs, or data processing units.

DPUs, or Data Processing Units, originally popularised by Mellanox, now wear a new look with NVIDIA, which acquired Mellanox earlier this year. DPUs are a new class of programmable processor built around flexible and programmable acceleration engines that improve application performance for AI and machine learning, security, telecommunications, and storage, among other workloads.

The team at Mellanox has already deployed the first generation of BlueField DPUs in leading high-performance computing, deep learning, and cloud data centres to provide new levels of performance, scale, and efficiency with improved operational agility.

The improvement in performance is due to the presence of a high-performance, software-programmable, multi-core CPU and a network interface capable of parsing, processing, and efficiently transferring data at line rate to GPUs and CPUs.

According to NVIDIA, a DPU can be used as a stand-alone embedded processor, but DPUs are usually incorporated into a SmartNIC, a network interface controller. SmartNICs are ideally suited for high-traffic web servers.

A DPU-based SmartNIC is a network interface card that offloads processing tasks that the system CPU would normally handle. Using its own on-board processor, a DPU-based SmartNIC may be able to perform any combination of encryption/decryption, firewall, TCP/IP and HTTP processing.

The CPU is for general-purpose computing, the GPU is for accelerated computing and the DPU, which moves data around the data centre, does data processing.

These DPUs, known by the name BlueField, have a design that enables programmable processing at speeds of up to 200Gb/s. The BlueField DPU integrates the NVIDIA Mellanox ConnectX best-in-class network adapter, encompassing hardware accelerators with advanced software programmability to deliver diverse software-defined solutions.

Organisations that rely on cloud-based solutions especially can benefit immensely from DPUs. Here are a few instances where DPUs flourish:

A bare-metal environment is one where a virtual machine is installed directly on hardware, rather than within a host operating system.

The shift towards microservices architecture has completely transformed the way enterprises ship applications at scale. Cloud-based applications generate a great deal of network activity and data, even when processing a single application request. According to Mellanox, one key application of the DPU is securing cloud-native workloads.

For instance, Kubernetes security is an immense challenge comprising many highly interrelated parts. The data intensity makes it hard to implement zero-trust security solutions, which creates challenges for security teams trying to protect customers' data and privacy.

As of late last year, the team at Mellanox stated that it was actively researching various platforms and integration schemes to leverage the cutting-edge acceleration engines in DPU-based SmartNICs for securing cloud-native workloads at 100Gb/s.

According to NVIDIA, a DPU comes with the following features:

Know more about DPUs here.


Read more:
What Are DPUs And Why Do We Need Them - Analytics India Magazine

Posted in Machine Learning | Comments Off on What Are DPUs And Why Do We Need Them – Analytics India Magazine

Machine Learning Market 2020 Professional Survey Report; Industry Growth, Shares, Opportunities And Forecast To 2026 – Surfacing Magazine

The Machine Learning Market research report is a valuable source of perceptive information for business strategists. This Machine Learning Market study provides comprehensive data which enhances the understanding, scope, and application of the market.

Summary of Report @ Machine Learning Market

A thorough study of the competitive landscape of the global Machine Learning Market has been given, presenting insights into the company profiles, financial status, recent developments, mergers and acquisitions, and the SWOT analysis. This research report will give readers a clear idea of the overall market scenario for deciding on market projects.

The analysts have also analyzed the drawbacks of ongoing Machine Learning trends and the opportunities contributing to the market's increased growth. The international Machine Learning market research report provides a perspective on the competitive landscape of worldwide markets. The report offers particulars that originate from analysis of the focus market, and it draws on innovations, trends, shares, and costs assessed by Machine Learning industry experts to maintain a consistent investigation.

Market Segment by Regions, regional analysis covers

The Machine Learning analysis was made to include both qualitative and quantitative facets of this market with regard to the global leading regions. The Machine Learning report also reinforces information concerning aspects like the major Machine Learning drivers and restraining factors that shape the market. It also covers multiple sections, including company profile, type, and applications.

We do provide a sample of this report. Please go through the following information in order to request a sample copy.

This report sample includes:

Brief Introduction to the research report.

Table of Contents (scope covered as a part of the study)

Top players in the market

Research framework (Structure Of The Report)

Research methodology adopted by Coherent Market Insights

Get Sample copy @ https://www.coherentmarketinsights.com/insight/request-sample/1098

Reasons why you should buy this report

Understand the current state and future of the Machine Learning Market in both developed and emerging markets.

The report assists in realigning the business strategies by highlighting the Machine Learning business priorities.

The report throws light on the segment expected to dominate the Machine Learning industry and market.

Forecasts the regions expected to witness the fastest growth.

The latest developments in the Machine Learning industry and details of the industry leaders along with their market share and strategies.

Saves time on entry-level research by identifying the growth, size, leading players, and segments of the global Machine Learning Market.

The global report is compiled using primary and secondary research methodologies, drawing on reliable sources to generate a factual database. Data from market journals, publications, conferences, white papers, and interviews with key market leaders are compiled to generate our segmentation and are mapped to a fair trajectory of the market during the forecast period.

The Request Discount option enables you to get a discount on the actual price of the report. Kindly fill in the form, and one of our consultants will get in touch with you to discuss your allocated budget and provide discounts.

Don't Quarantine Your Research: you keep your social distance and we provide you a social DISCOUNT. Use the STAYHOME code in your precise requirement and get FLAT 1000 USD OFF on all CMI reports.

Request for Discount @ https://www.coherentmarketinsights.com/insight/request-discount/1098

Market Drivers and Restraints:

Emergence of new technologies in Enterprise Mobility

Economies of Scale in the Operational Expenditure

Lack of Training Expertise and Skills

Data Security concerns

Key highlights of this report:

Overview of key market forces driving and restraining the market growth

Market and Forecast (2018–2026)

Analyses of market trends and technological improvements

Analyses of market competition dynamics to offer you a competitive edge

An analysis of strategies of major competitors

Machine Learning Market Volume and Forecast (2018–2026)

Companies' Market Share Analysis

Analysis of major industry segments

Detailed analyses of industry trends

Offers a clear understanding of the competitive landscape and key product segments

About Coherent Market Insights:

Coherent Market Insights is a prominent market research and consulting firm offering action-ready syndicated research reports, custom market analysis, consulting services, and competitive analysis through various recommendations related to emerging market trends, technologies, and potential absolute dollar opportunity.

Contact Us:

Mr. Shah
Coherent Market Insights
1001 4th Ave, #3200
Seattle, WA 98154
Tel: +1-206-701-6702
Email: sales@coherentmarketinsights.com
Visit here for more information: https://theemmasblog.blogspot.com/

Link:
Machine Learning Market 2020 Professional Survey Report; Industry Growth, Shares, Opportunities And Forecast To 2026 - Surfacing Magazine

Posted in Machine Learning | Comments Off on Machine Learning Market 2020 Professional Survey Report; Industry Growth, Shares, Opportunities And Forecast To 2026 – Surfacing Magazine

What is machine learning, and how does it work? – Pew Research Center

At Pew Research Center, we collect and analyze data in a variety of ways. Besides asking people what they think through surveys, we also regularly study things like images, videos and even the text of religious sermons.

In a digital world full of ever-expanding datasets like these, it's not always possible for humans to analyze such vast troves of information themselves. That's why our researchers have increasingly made use of a method called machine learning. Broadly speaking, machine learning uses computer programs to identify patterns across thousands or even millions of data points. In many ways, these techniques automate tasks that researchers have done by hand for years.

Our latest video explainer, part of our Methods 101 series, explains the basics of machine learning and how it allows researchers at the Center to analyze data on a large scale. To learn more about how we've used machine learning and other computational methods in our research, including the analysis mentioned in this video, you can explore recent reports from our Data Labs team.

The rest is here:
What is machine learning, and how does it work? - Pew Research Center

Posted in Machine Learning | Comments Off on What is machine learning, and how does it work? – Pew Research Center

How to choose between rule-based AI and machine learning – TechTalks

By Elana Krasner

Companies across industries are exploring and implementing artificial intelligence (AI) projects, from big data to robotics, to automate business processes, improve customer experience, and innovate product development. According to McKinsey, embracing AI promises considerable benefits for businesses and economies through its contributions to productivity and growth. But with that promise comes challenges.

Computers and machines don't come into this world with inherent knowledge or an understanding of how things work. Like humans, they need to be taught that a red light means stop and green means go. So, how do these machines actually gain the intelligence they need to carry out tasks like driving a car or diagnosing a disease?

There are multiple ways to achieve AI, and essential to them all is data. Without quality data, artificial intelligence is a pipedream. There are two ways data can be manipulated to achieve AI (through rules or through machine learning), and some best practices can help you choose between the two methods.

Long before AI and machine learning (ML) became mainstream terms outside of the high-tech field, developers were encoding human knowledge into computer systems as rules that get stored in a knowledge base. These rules define all aspects of a task, typically in the form of "if" statements (if A, then do B; else if X, then do Y).

While the number of rules that have to be written depends on the number of actions you want a system to handle (for example, 20 actions means manually writing and coding at least 20 rules), rules-based systems are generally lower effort, more cost-effective and less risky, since these rules won't change or update on their own. However, rules can limit AI capabilities with rigid intelligence that can only do what they've been written to do.
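
A rules-based system of this kind is easy to picture in code. The sketch below is a minimal illustration; the rules, action names and message-routing task are invented for the example:

```python
# A tiny knowledge base of hand-written rules: each pairs a condition with an action.
RULES = [
    (lambda msg: "refund" in msg.lower(),   "route_to_billing"),
    (lambda msg: "password" in msg.lower(), "route_to_it_support"),
    (lambda msg: "cancel" in msg.lower(),   "route_to_retention"),
]

def classify(message: str) -> str:
    """Apply the fixed if/then rules in order; fall back to a default action."""
    for condition, action in RULES:
        if condition(message):
            return action
    return "route_to_human"

print(classify("I forgot my password"))  # route_to_it_support
print(classify("Where is my parcel?"))   # route_to_human: no rule covers this
```

Every new behaviour requires a human to write another rule, which is exactly the fixed intelligence described above.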

While a rules-based system could be considered as having fixed intelligence, a machine learning system, in contrast, is adaptive and attempts to simulate human intelligence. There is still a layer of underlying rules, but instead of a human writing a fixed set, the machine has the ability to learn new rules on its own, and discard ones that aren't working anymore.

In practice, there are several ways a machine can learn, but supervised training (when the machine is given data to train on) is generally the first step in a machine learning program. Eventually, the machine will be able to interpret, categorize, and perform other tasks with unlabeled data or unknown information on its own.
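
By contrast with the hand-written rules sketched earlier, a supervised learning system infers its own decision rules from labelled examples. Here is a minimal sketch using scikit-learn on the same routing task; the training data is invented for illustration:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Labelled training examples: the machine learns the routing rules itself.
messages = [
    "I want a refund for last month",
    "I can't log in, reset my password",
    "Please cancel my subscription",
    "My invoice looks wrong",
]
labels = ["billing", "it_support", "retention", "billing"]

model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(messages, labels)  # supervised training on the labelled data

# The trained model generalises to wording it has never seen before.
print(model.predict(["forgot my password again"]))  # ['it_support']
```

Unlike the rules-based sketch, retraining on new labelled data updates the system's behaviour without anyone editing the rules by hand.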

The anticipated benefits of AI are high, so the decisions a company makes early in its execution can be critical to success. Foundational among them is aligning your technology choices with the underlying business goals that AI was set forth to achieve. What problems are you trying to solve? What challenges are you trying to meet?

The decision to implement a rules-based or machine learning system will have a long-term impact on how a company's AI program evolves and scales. Here are some best practices to consider when evaluating which approach is right for your organization:

When choosing a rules-based approach makes sense:

When to apply machine learning:

The promises of AI are real, but for many organizations, the challenge is where to begin. If you fall into this category, start by determining whether a rules-based or ML method will work best for your organization.

About the author:

Elana Krasner is Product Marketing Manager at 7Park Data, a data and analytics company that transforms raw data into analytics-ready products using machine learning and NLP technologies. She has been in the tech marketing field for almost 10 years and has worked across the industry in Cloud Computing, SaaS and Data Analytics.

Read the rest here:
How to choose between rule-based AI and machine learning - TechTalks

Posted in Machine Learning | Comments Off on How to choose between rule-based AI and machine learning – TechTalks