The Future Of Nano Technology
Category Archives: Machine Learning
As companies step up the use of machine learning-enabled systems in their day-to-day operations, they become increasingly reliant on those systems to help them make critical business decisions. In some cases, the machine learning systems operate autonomously, making it especially important that the automated decision-making works as intended.
However, machine learning-based systems are only as good as the data that's used to train them. If there are inherent biases in the data used to feed a machine learning algorithm, the result could be systems that are untrustworthy and potentially harmful.
In this article, you'll learn why bias in AI systems is a cause for concern, how to identify different types of bias, and six effective methods for reducing bias in machine learning.
The power of machine learning comes from its ability to learn from data and apply that learning to new data the systems have never seen before. One of the challenges data scientists face is ensuring that the data fed into machine learning algorithms is not only clean, accurate and, in the case of supervised learning, well-labeled, but also free of any inherent bias that can skew the results.
Supervised learning, one of the core approaches to machine learning, depends especially heavily on the quality of its training data. So it should be no surprise that when biased training data is used to teach these systems, the result is biased AI systems. Biased AI systems that are put into production can cause real problems, especially when used in automated decision-making, autonomous operation, or facial recognition software that makes predictions or renders judgment on individuals.
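To make the skew concrete, here is a minimal, purely illustrative sketch (synthetic data and hypothetical groups, not from any real system) of how a decision threshold tuned on one over-represented group can systematically disadvantage another group whose scores are measured differently:

```python
import random

random.seed(42)

# Synthetic data: two groups with identical underlying ability, but group B's
# measured scores are shifted down by a biased measurement process.
scores_a = [random.gauss(70, 10) for _ in range(5000)]
scores_b = [random.gauss(70, 10) - 15 for _ in range(5000)]

# The acceptance threshold is tuned on group A alone (the over-represented
# group), accepting roughly half of its members.
threshold = sorted(scores_a)[len(scores_a) // 2]

accept_a = sum(s >= threshold for s in scores_a) / len(scores_a)
accept_b = sum(s >= threshold for s in scores_b) / len(scores_b)

# Equally able candidates, very different acceptance rates.
print(f"group A: {accept_a:.2f}, group B: {accept_b:.2f}")
```

The model here is just a threshold, but the same dynamic plays out with any model fit predominantly on one subpopulation.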
Some notable examples of the bad outcomes caused by algorithmic bias include: a Google image recognition system that misidentified images of minorities in an offensive way; automated credit applications from Goldman Sachs that have sparked an investigation into gender bias; and a racially biased AI program used to sentence criminals. Enterprises must be hyper-vigilant about machine learning bias: Any value delivered by AI and machine learning systems in terms of efficiency or productivity will be wiped out if the algorithms discriminate against individuals and subsets of the population.
However, AI bias is not only limited to discrimination against individuals. Biased data sets can jeopardize business processes when applied to objects and data of all types. For example, take a machine learning model that was trained to recognize wedding dresses. If the model was trained using Western data, then wedding dresses would be categorized primarily by identifying shades of white. This model would fail in non-Western countries where colorful wedding dresses are more commonly accepted. Errors also abound where data sets have bias in terms of the time of day when data was collected, the condition of the data and other factors.
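A simple audit can catch this kind of coverage gap before deployment. The sketch below (hypothetical color labels, not taken from any real data set) compares the categories present in the training data against those expected in the deployment environment:

```python
# Colors seen in a hypothetical Western-only training set versus colors
# expected where the model will actually be deployed.
train_colors = {"white", "ivory", "cream"}
deploy_colors = {"white", "ivory", "red", "gold", "green"}

unseen = deploy_colors - train_colors          # categories the model never saw
coverage = len(deploy_colors & train_colors) / len(deploy_colors)

print(f"unseen in training: {sorted(unseen)}, coverage: {coverage:.0%}")
```

Low coverage of the deployment distribution is a signal to broaden data collection before trusting the model outside the environment it was trained on.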
All of the examples described above represent some sort of bias that was introduced by humans as part of their data selection and identification methods for training the machine learning model. Because the systems technologists build are necessarily colored by their own experiences, they must be very aware that their individual biases can jeopardize the quality of the training data. Individual bias, in turn, can easily become a systemic bias as bad predictions and unfair outcomes are automated.
Part of the challenge of identifying bias is due to the difficulty of seeing how some machine learning algorithms generalize their learning from the training data. In particular, deep learning algorithms have proven to be remarkably powerful in their capabilities. This approach to neural networks leverages large quantities of data, high performance compute power and a sophisticated approach to efficiency, resulting in machine learning models with profound abilities.
Deep learning, however, is a "black box": it's not clear how the neural network predictive model arrives at an individual decision. You can't simply query the system and determine with precision which inputs resulted in which outputs, which makes it hard to spot and eliminate potential biases when they arise in the results. Researchers are increasingly focusing on adding explainability to neural networks. One approach is verification, the process of formally proving properties of a neural network; however, the sheer size of modern networks makes them hard to check for bias this way.
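Even without opening the black box, a model can be probed from the outside. The sketch below (toy model and synthetic data, purely illustrative) uses permutation testing: shuffle one input feature at a time and watch how much accuracy drops, which reveals which features the model actually leans on:

```python
import random

random.seed(1)

# A "black box" we can only call, not inspect.
def predict(x1, x2):
    return 1 if 0.9 * x1 + 0.1 * x2 > 0.5 else 0

# Synthetic data whose true label depends only on x1.
data = [(random.random(), random.random()) for _ in range(2000)]
labels = [1 if x1 > 0.5 else 0 for x1, _ in data]

def accuracy(rows):
    return sum(predict(a, b) == y for (a, b), y in zip(rows, labels)) / len(rows)

def permuted(col):
    # Shuffle one column while keeping everything else fixed.
    shuffled = [row[col] for row in data]
    random.shuffle(shuffled)
    return [(s, x2) if col == 0 else (x1, s)
            for (x1, x2), s in zip(data, shuffled)]

base = accuracy(data)
drop_x1 = base - accuracy(permuted(0))
drop_x2 = base - accuracy(permuted(1))

# A large drop for x1 and a small one for x2 shows the model relies on x1.
print(f"base={base:.2f} drop_x1={drop_x1:.2f} drop_x2={drop_x2:.2f}")
```

If the feature the model leans on turns out to be a proxy for a protected attribute, that is a concrete bias finding obtained without any access to the model's internals.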
Until we have truly explainable systems, we must understand how to recognize and measure AI bias in machine learning models. Some biases arise from the selection of the training data itself: the model needs to represent the data as it exists in the real world. If your data set is artificially constrained to a subset of the population, the model will produce skewed results in the real world, even if it performs very well against the training data. Likewise, data scientists must take care in selecting which data to include in a training set and which features or dimensions to represent in it.
Companies are combating inherent data bias by implementing programs to not only broaden the diversity of their data sets, but also the diversity of their teams. More diversity on teams means that people of many perspectives and varied experiences are feeding systems the data points to learn from. Unfortunately, the tech industry today is very homogeneous; there are not many women or people of color in the field. Efforts to diversify teams should also have a positive impact on the machine learning models produced, since data science teams will be better able to understand the requirements for more representative data sets.
There are a few sources for the bias that can have an adverse impact on machine learning models. Some of these are represented in the data that is collected and others in the methods used to sample, aggregate, filter and enhance that data.
There are no doubt other types of bias that might be represented in a data set beyond those listed above, and all of these forms should be identified early in a machine learning project.
1. Identify potential sources of bias. Using the above sources of bias as a guide, one way to address and mitigate bias is to examine the data and see how the different forms of bias could affect the data being used to train the machine learning model. Have you selected the data without bias? Have you made sure there isn't any bias arising from errors in data capture or observation? Are you avoiding historical data sets tainted with prejudice or confirmation bias? Asking these questions helps you identify, and potentially eliminate, that bias.
2. Set guidelines, rules and procedures for eliminating bias. To keep bias in check, organizations should set guidelines, rules and procedures for identifying, communicating and mitigating potential data set bias. Forward-thinking organizations document cases of bias as they occur, outline the steps taken to identify them, and explain the efforts made to mitigate them. By establishing these rules and communicating them in an open, transparent manner, organizations can put their best foot forward in addressing machine learning model bias.
3. Identify accurate representative data. Prior to collecting and aggregating data for machine learning model training, organizations should first try to understand what a representative data set should look like. Data scientists should use their data analysis skills to understand the nature of the population that is to be modeled along with the characteristics of the data used to create the machine learning model. These two things should match in order to build a data set with as little bias as possible.
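One lightweight way to apply this step is to compare group proportions in the training data against known population shares and flag large gaps. The sketch below uses hypothetical shares and an arbitrary five-point tolerance; both are assumptions to be tuned per project:

```python
from collections import Counter

# Hypothetical population shares and a hypothetical training set.
population_share = {"group_a": 0.60, "group_b": 0.30, "group_c": 0.10}
training_rows = ["group_a"] * 820 + ["group_b"] * 100 + ["group_c"] * 80

counts = Counter(training_rows)
n = sum(counts.values())

# Flag any group whose training-set share strays more than 5 points
# from its population share.
flagged = {g for g in population_share
           if abs(counts[g] / n - population_share[g]) > 0.05}

print(f"over/under-represented groups: {sorted(flagged)}")
```

Here group_a is over-represented and group_b under-represented, so both would be flagged for rebalancing before training.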
4. Document and share how data is selected and cleansed. Many forms of bias are introduced when selecting data from large data sets and during data cleansing operations. To minimize bias-inducing mistakes, organizations should document their methods of data selection and cleansing and allow others to examine when, and if, the models exhibit any form of bias. Transparency enables root-cause analysis, so sources of bias can be eliminated in future model iterations.
5. Evaluate the model for bias, in addition to performance. Machine learning models are typically evaluated prior to being placed into operation, and most of the time these evaluation steps focus on aspects of model accuracy and precision. Organizations should also add measures of bias detection to their model evaluation steps. Even if the model performs with acceptable accuracy and precision on particular tasks, it could fail on measures of bias, which might point to issues with the training data.
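Bias checks can sit right beside accuracy in the evaluation suite. The sketch below computes two common fairness measures, the demographic-parity gap and the equal-opportunity (true-positive-rate) gap, over hypothetical evaluation records:

```python
# Hypothetical evaluation records: (group, true_label, predicted_label).
records = [
    ("a", 1, 1), ("a", 1, 1), ("a", 0, 1), ("a", 0, 0), ("a", 1, 1),
    ("b", 1, 0), ("b", 1, 1), ("b", 0, 0), ("b", 1, 0), ("b", 0, 0),
]

def positive_rate(group, truth=None):
    rows = [r for r in records
            if r[0] == group and (truth is None or r[1] == truth)]
    return sum(r[2] == 1 for r in rows) / len(rows)

# Demographic parity: do both groups receive positive predictions equally often?
parity_gap = positive_rate("a") - positive_rate("b")

# Equal opportunity: among truly positive cases, are true-positive rates equal?
tpr_gap = positive_rate("a", truth=1) - positive_rate("b", truth=1)

print(f"parity gap={parity_gap:.2f}, TPR gap={tpr_gap:.2f}")
```

A release gate could then require both gaps to fall below an agreed threshold before the model goes into operation, just as it requires a minimum accuracy.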
6. Monitor and review models in operation. Finally, there is a difference between how the machine learning model performs in training and how it performs in the real world. Organizations should provide methods to monitor and continuously review the models as they perform in operation. If there are signs that certain forms of bias are showing up in the results, then the organization can take action before the bias causes irreparable harm.
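In practice, this last step can be as simple as tracking per-group outcome rates over time and alerting when they drift. A minimal sketch, with hypothetical weekly monitoring numbers and an arbitrary drift tolerance:

```python
# Hypothetical weekly positive-prediction rates per group, from production logs.
weekly_rates = {
    "group_a": [0.50, 0.51, 0.49, 0.50],
    "group_b": [0.48, 0.42, 0.35, 0.28],
}

def drifted(rates, tolerance=0.10):
    # Alert when the latest rate has moved more than `tolerance`
    # away from the first observed week.
    return abs(rates[-1] - rates[0]) > tolerance

alerts = [g for g, r in weekly_rates.items() if drifted(r)]
print(f"groups drifting: {alerts}")
```

Real monitoring would use statistical tests over larger windows, but even a crude rule like this catches the kind of slow divergence that causes harm when left unreviewed.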
When bias becomes embedded in machine learning models, it can have an adverse impact on our daily lives. The bias is exhibited in the form of exclusion, such as certain groups being denied loans or not being able to use the technology, or in the technology not working the same for everyone. As AI continues to become more a part of our lives, the risks from bias only grow larger. Companies, researchers and developers have a responsibility to minimize bias in AI systems. A lot of it comes down to ensuring that the data sets are representative and that the interpretation of data sets is correctly understood. However, just making sure that the data sets aren't biased won't actually remove bias, so having diverse teams of people working toward the development of AI remains an important goal for enterprises.
Read more from the original source:
6 ways to reduce different types of bias in machine learning - TechTarget
Data Science and Machine Learning Service Market Size 2020 Application, Trends, Growth, Opportunities and Worldwide Forecast to 2025 – 3rd Watch News
The latest report on Data Science and Machine Learning Service Industry market now available at MarketStudyReport.com, delivers facts and numbers regarding the market size, geographical landscape and profit forecast of the Data Science and Machine Learning Service Industry market. In addition, the report focuses on major obstacles and the latest growth plans adopted by leading companies in this business.
The Data Science and Machine Learning Service Industry market report presents a comprehensive assessment of this industry vertical and comprises significant insights pertaining to the current as well as anticipated state of the marketplace over the forecast period. Key industry trends impacting the Data Science and Machine Learning Service Industry market are also mentioned in the report. The document delivers information about industry policies, the regional spectrum and other parameters, including the impact of the current industry scenario on investors.
Request a sample report of the Data Science and Machine Learning Service Industry Market at: https://www.marketstudyreport.com/request-a-sample/2706049
The report on the Data Science and Machine Learning Service Industry market evaluates the advantages and disadvantages of company products and provides an overview of the competitive scenario. Significant data regarding raw materials and downstream buyers is also provided.
Revealing information concerning the Data Science and Machine Learning Service Industry market competitive terrain:
Important data regarding the Data Science and Machine Learning Service Industry market regional landscape:
Ask for a discount on the Data Science and Machine Learning Service Industry Market report at: https://www.marketstudyreport.com/check-for-discount/2706049
Other takeaways from the Data Science and Machine Learning Service Industry market report:
For More Details On this Report: https://www.marketstudyreport.com/reports/covid-19-outbreak-global-data-science-and-machine-learning-service-industry-market-report-development-trends-threats-opportunities-and-competitive-landscape-in-2020
1. COVID-19 Outbreak-Global Compliance Software Industry Market Report-Development Trends, Threats, Opportunities and Competitive Landscape in 2020. Read more: https://www.marketstudyreport.com/reports/covid-19-outbreak-global-compliance-software-industry-market-report-development-trends-threats-opportunities-and-competitive-landscape-in-2020
2. COVID-19 Outbreak-Global Optical Connectors Industry Market Report-Development Trends, Threats, Opportunities and Competitive Landscape in 2020. Read more: https://www.marketstudyreport.com/reports/covid-19-outbreak-global-optical-connectors-industry-market-report-development-trends-threats-opportunities-and-competitive-landscape-in-2020
Contact Us:
Corporate Sales, Market Study Report LLC
Phone: 1-302-273-0910
Toll Free: 1-866-764-2150
Email: [emailprotected]
Machine Learning Chips Market Growth by Top Companies, Trends by Types and Application, Forecast to 2026 – 3rd Watch News
Los Angeles, United States: QY Research recently published a research report titled Global Machine Learning Chips Market Research Report 2020-2026. The research report attempts to give a holistic overview of the Machine Learning Chips market by keeping the information simple, relevant, accurate and to the point. The researchers have explained each aspect of the market through meticulous research and undivided attention to every topic, and have provided statistical data to help readers understand the whole market. The report further provides historic and forecast data generated through primary and secondary research of each region and its respective manufacturers.
Get Full PDF Sample Copy of Report: (Including Full TOC, List of Tables & Figures, Chart) https://www.qyresearch.com/sample-form/form/1839774/global-machine-learning-chips-market
The Global Machine Learning Chips Market report gives special attention to manufacturers in different regions that are expected to show considerable expansion in their market share. Additionally, it underlines the current and future trends these manufacturers are adopting to boost their market shares. Understanding the various strategies being carried out by manufacturers will help readers make the right business decisions.
Key Players Mentioned in the Global Machine Learning Chips Market Research Report: Wave Computing, Graphcore, Google Inc, Intel Corporation, IBM Corporation, Nvidia Corporation, Qualcomm, Taiwan Semiconductor Manufacturing
Global Machine Learning Chips Market Segmentation by Product: Neuromorphic Chip, Graphics Processing Unit (GPU) Chip, Flash Based Chip, Field Programmable Gate Array (FPGA) Chip, Other Machine Learning Chips
Global Machine Learning Chips Market Segmentation by Application: Robotics Industry, Consumer Electronics, Automotive, Healthcare, Other
The Machine Learning Chips market is divided into two important segments: product type and end user. The product type segment lists all the products currently manufactured by the companies and their economic role in the Machine Learning Chips market; it also reports new products currently in development and their scope. The end user segment presents a detailed understanding of the end users that are a governing force in the Machine Learning Chips market.
In this chapter of the Machine Learning Chips market report, the researchers explore the various regions expected to witness fruitful developments and make serious contributions to the market's burgeoning growth. Along with general statistical information, the report provides data for each region with respect to its revenue, production and presence of major manufacturers. The major regions covered in the report include North America, Europe, Central and South America, Asia Pacific, South Asia, the Middle East and Africa, GCC countries, and others.
Key questions answered in the report:
Get the complete report in your inbox within 24 hours for USD 4,900: https://www.qyresearch.com/settlement/pre/63325f365f710a0813535ce285714216,0,1,global-machine-learning-chips-market
Table of Contents

1 Study Coverage: product introduction; key market segments; key manufacturers covered (ranked by 2019 revenue); market size growth rate by type (Neuromorphic Chip, GPU Chip, Flash Based Chip, FPGA Chip, Other) and by application (Robotics Industry, Consumer Electronics, Automotive, Healthcare, Other); study objectives; years considered
2 Executive Summary: global market size estimates and forecasts (revenue, production capacity and production, 2015-2026); market size by producing regions (2015 vs 2020 vs 2026); competitive landscape analysis (concentration ratio CR5 and HHI, market share by company tier, geographical distribution of manufacturers); key trends; primary interviews with key players
3 Market Size by Manufacturers: top manufacturers by production capacity, production and revenue (2015-2020); top 10 and top 5 companies by 2019 revenue; price by manufacturers; mergers, acquisitions and expansion plans
4 Production by Regions: historic production and revenue by region (2015-2020); North America, Europe, China, Japan and South Korea, each covering production, revenue, key players and imports & exports (2015-2020)
5 Consumption by Region: top regions by consumption and market share (2015-2020); North America (U.S., Canada); Europe (Germany, France, U.K., Italy, Russia); Asia Pacific (China, Japan, South Korea, India, Australia, Taiwan, Indonesia, Thailand, Malaysia, Philippines, Vietnam); Central & South America (Mexico, Brazil, Argentina); Middle East and Africa (Turkey, Saudi Arabia, U.A.E.), each broken down by application
6 Market Size by Type (2015-2026): production, revenue and price by type, historic (2015-2020) and forecast (2021-2026); market share by price tier (low-end, mid-range, high-end)
7 Market Size by Application (2015-2026): consumption breakdown by application, historic (2015-2020) and forecast (2021-2026)
8 Corporate Profiles: for each of Wave Computing, Graphcore, Google Inc, Intel Corporation, IBM Corporation, Nvidia Corporation, Qualcomm and Taiwan Semiconductor Manufacturing: corporate information; overview; production capacity and supply, price, revenue and gross margin (2015-2020); product description; related developments
9 Production Forecast by Regions: revenue and production forecasts (2021-2026) for the top regions, including North America, Europe, China, Japan and South Korea
10 Consumption Forecast by Region: consumption forecasts (2021-2026) for North America, Europe, Asia Pacific, Latin America, and the Middle East and Africa
11 Value Chain and Sales Channels Analysis: value chain; sales channels and distributors; customers
12 Market Opportunities, Challenges, Risks and Influence Factors: industry trends; opportunities and drivers; challenges; risks/restraints; Porter's Five Forces analysis
13 Key Findings of the Global Machine Learning Chips Study
14 Appendix: research methodology and approach; data sources; author details; disclaimer
QY Research, established in 2007, focuses on custom research, management consulting, IPO consulting, industry chain research, databases and seminar services. The company owns a large foundational database (including the National Bureau of Statistics database, customs import and export databases, and industry association databases) and expert resources spanning energy, automotive, chemicals, medical, ICT and consumer goods.
OpenAI, the machine learning nonprofit co-founded by Elon Musk, has released its first commercial product: a rentable version of a text generation tool the organisation once deemed too dangerous to release.
Dubbed simply "the API", the new service lets businesses directly access the most powerful version of GPT-3, OpenAI's general-purpose text generation AI.
The tool is already a more than capable writer. Fed the opening line of George Orwell's Nineteen Eighty-Four, "It was a bright cold day in April, and the clocks were striking thirteen", an earlier version of the system recognised the vaguely futuristic tone and the novelistic style, and continued with: "I was in my car on my way to a new job in Seattle. I put the gas in, put the key in, and then I let it run. I just imagined what the day would be like. A hundred years from now. In 2045, I was a teacher in some school in a poor part of rural China. I started with Chinese history and history of science."
Now, OpenAI wants to put the same power to more commercial uses such as coding and data entry. For instance, if, rather than Orwell, the prompt is a list of the names of six companies and the stock tickers and foundation dates of two of them, the system will finish it by filling in the missing details for the other companies.
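The pattern behind that use is few-shot prompting: list complete examples, then leave the final entries incomplete for the model to fill in. A sketch of building such a prompt (the companies and field layout here are illustrative, not taken from the article):

```python
# Complete examples the model can infer the pattern from.
examples = [
    ("Apple", "AAPL", 1976),
    ("Microsoft", "MSFT", 1975),
]
# Entries whose details the model should fill in.
targets = ["Alphabet", "Amazon"]

lines = [f"Company: {name} | Ticker: {tkr} | Founded: {yr}"
         for name, tkr, yr in examples]
lines += [f"Company: {name} | Ticker:" for name in targets]
prompt = "\n".join(lines)

print(prompt)
```

Sent to the API, a prompt like this ends mid-pattern, so the model's most likely continuation is the missing tickers and founding dates, which is exactly the data-entry behaviour described above.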
It will mark the first commercial uses of a technology which stunned the industry in February 2019 when OpenAI first revealed its progress in teaching a computer to read and write. The group was so impressed by the capability of its new creation that it was initially wary of publishing the full version, warning that it could be misused for ends the nonprofit had not foreseen.
"We need to perform experimentation to find out what they can and can't do," said Jack Clark, the group's head of policy, at the time. "If you can't anticipate all the abilities of a model, you have to prod it to see what it can do. There are many more people than us who are better at thinking what it can do maliciously."
Now, that fear has lessened somewhat, with GPT-2 having been available to the public for almost a year. Still, the company says: "The field's pace of progress means that there are frequently surprising new applications of AI, both positive and negative."
"We will terminate API access for obviously harmful use-cases, such as harassment, spam, radicalisation, or astroturfing [masking who is behind a message]. But we also know we can't anticipate all of the possible consequences of this technology, so we are launching today in a private beta [test version] rather than general availability."
OpenAI was founded with a $1bn (£0.8bn) endowment in 2015, backed by Musk and others, to advance digital intelligence "in the way that is most likely to benefit humanity". Musk has since left the board, but remains a donor.
We have heard of CPUs and TPUs; now NVIDIA, with the help of its recent acquisition Mellanox, is bringing a new class of processors to power up deep learning applications: DPUs, or data processing units.
DPUs, originally popularised by Mellanox, now wear a new look with NVIDIA, which acquired Mellanox earlier this year. DPUs are a new class of programmable processor consisting of flexible, programmable acceleration engines that improve application performance for AI and machine learning, security, telecommunications and storage, among other workloads.
The team at Mellanox has already deployed the first generation of BlueField DPUs in leading high-performance computing, deep learning, and cloud data centres to provide new levels of performance, scale, and efficiency with improved operational agility.
The improvement in performance is due to the presence of a high-performance, software-programmable, multi-core CPU and a network interface capable of parsing, processing and efficiently transferring data at line rate to GPUs and CPUs.
According to NVIDIA, a DPU can be used as a stand-alone embedded processor. DPUs are usually incorporated into a SmartNIC, a network interface controller. SmartNICs are ideally suited for high-traffic web servers.
A DPU-based SmartNIC is a network interface card that offloads processing tasks that the system CPU would normally handle. Using its own on-board processor, a DPU-based SmartNIC may be able to perform any combination of encryption/decryption, firewall, TCP/IP, and HTTP processing.
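The offload pattern can be sketched in ordinary code. This is an illustrative analogy only, not a real SmartNIC API: a thread pool stands in for the DPU's on-board processor, absorbing per-packet work (here, a checksum) so the host path stays free for application logic. The function names are my own.

```python
from concurrent.futures import ThreadPoolExecutor
import hashlib

# Stand-in for the SmartNIC's on-board processor: per-packet work
# submitted here runs off the host's critical path.
dpu = ThreadPoolExecutor(max_workers=4)

def packet_checksum(payload: bytes) -> str:
    # The kind of per-packet task a DPU would absorb in hardware.
    return hashlib.sha256(payload).hexdigest()

def handle_request(payload: bytes):
    # Offload the checksum, then do "application logic" on the host
    # CPU while the offloaded work proceeds concurrently.
    checksum_future = dpu.submit(packet_checksum, payload)
    response = payload.upper()  # toy application logic
    return response, checksum_future.result()
```

The point of the sketch is the division of labour, not the concurrency mechanism: on real hardware the offloaded stage runs on the card itself, at line rate.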
The CPU is for general-purpose computing, the GPU is for accelerated computing and the DPU, which moves data around the data centre, does data processing.
These DPUs, known as BlueField, have a unique design that enables programmability at speeds of up to 200Gb/s. The BlueField DPU integrates the best-in-class NVIDIA Mellanox Connect network adapter, combining hardware accelerators with advanced software programmability to deliver diverse software-defined solutions.
Organisations that rely on cloud-based solutions can benefit especially from DPUs. Here are a few instances where DPUs flourish:
A bare-metal environment is one in which applications run directly on physical servers, with no virtualisation layer between them and the hardware.
The shift towards microservices architecture has completely transformed the way enterprises ship applications at scale. Cloud-based applications generate a great deal of network activity and data, even when processing a single application request. According to Mellanox, one key application of the DPU is securing cloud-native workloads.
For instance, Kubernetes security is an immense challenge comprising many highly interrelated parts. The data intensity makes it hard to implement zero-trust security solutions, which creates challenges for security teams trying to protect customers' data and privacy.
As of late last year, the team at Mellanox stated that they were actively researching various platforms and integration schemes to leverage the cutting-edge acceleration engines in DPU-based SmartNICs for securing cloud-native workloads at 100Gb/s.
Jun 11th 2020
THE FUNDAMENTAL assumption of the computing industry is that number-crunching gets cheaper all the time. Moore's law, the industry's master metronome, predicts that the number of components that can be squeezed onto a microchip of a given size (and thus, loosely, the amount of computational power available at a given cost) doubles every two years.
"For many comparatively simple AI applications, that means that the cost of training a computer is falling," says Christopher Manning, the director of Stanford University's AI Lab. But that is not true everywhere. A combination of ballooning complexity and competition means costs at the cutting edge are rising sharply.
Dr Manning gives the example of BERT, an AI language model built by Google in 2018 and used in the firm's search engine. It had more than 350m internal parameters and a prodigious appetite for data. It was trained using 3.3bn words of text culled mostly from Wikipedia, an online encyclopedia. These days, says Dr Manning, Wikipedia is not such a large data-set. "If you can train a system on 30bn words it's going to perform better than one trained on 3bn." And more data means more computing power to crunch it all.
OpenAI, a research firm based in California, says demand for processing power took off in 2012, as excitement around machine learning was starting to build. It has accelerated sharply since. By 2018, the computing power used to train big models had risen 300,000-fold, and was doubling every three and a half months (see chart). It should know: to train its own OpenAI Five system, designed to beat humans at Defense of the Ancients 2, a popular video game, it scaled machine learning to unprecedented levels, running thousands of chips non-stop for more than ten months.
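The arithmetic behind those figures can be checked directly. A minimal sketch, using only the numbers quoted above:

```python
import math

# The article's two growth rates, side by side. A 300,000-fold rise
# over the six years from 2012 to 2018 implies roughly 18 doublings,
# i.e. one every four months or so, against Moore's-law doubling
# every two years.
growth = 300_000
months = 12 * (2018 - 2012)

doublings = math.log2(growth)          # about 18.2 doublings
doubling_period = months / doublings   # about 4 months each
moore_growth = 2 ** (months / 24)      # Moore's law over the same span

print(f"{doublings:.1f} doublings, one every {doubling_period:.1f} months")
print(f"Moore's-law growth over the same period: {moore_growth:.0f}x")
```

The implied doubling period comes out a little under four months, consistent with the "three and a half months" figure given the roughness of both numbers; Moore's law over the same six years yields only an 8x gain.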
Exact figures on how much all this costs are scarce. But a paper published in 2019 by researchers at the University of Massachusetts Amherst estimated that training one version of Transformer, another big language model, could cost as much as $3m. Jerome Pesenti, Facebook's head of AI, says that one round of training for the biggest models can cost millions of dollars in electricity consumption.
Facebook, which turned a profit of $18.5bn in 2019, can afford those bills. Those less flush with cash are feeling the pinch. Andreessen Horowitz, an influential American venture-capital firm, has pointed out that many AI startups rent their processing power from cloud-computing firms like Amazon and Microsoft. The resulting bills, sometimes 25% of revenue or more, are one reason, it says, that AI startups may make for less attractive investments than old-style software companies. In March Dr Manning's colleagues at Stanford, including Fei-Fei Li, an AI luminary, launched the National Research Cloud, a cloud-computing initiative to help American AI researchers keep up with spiralling bills.
The growing demand for computing power has fuelled a boom in chip design and specialised devices that can perform the calculations used in AI efficiently. The first wave of specialist chips consisted of graphics processing units (GPUs), designed in the 1990s to boost video-game graphics. As luck would have it, GPUs are also fairly well-suited to the sort of mathematics found in AI.
Further specialisation is possible, and companies are piling in to provide it. In December, Intel, a giant chipmaker, bought Habana Labs, an Israeli firm, for $2bn. Graphcore, a British firm founded in 2016, was valued at $2bn in 2019. Incumbents such as Nvidia, the biggest GPU-maker, have reworked their designs to accommodate AI. Google has designed its own tensor-processing unit (TPU) chips in-house. Baidu, a Chinese tech giant, has done the same with its own Kunlun chips. Alfonso Marone at KPMG reckons the market for specialised AI chips is already worth around $10bn, and could reach $80bn by 2025.
"Computer architectures need to follow the structure of the data they're processing," says Nigel Toon, one of Graphcore's co-founders. The most basic feature of AI workloads is that they are "embarrassingly parallel", which means they can be cut into thousands of chunks which can all be worked on at the same time. Graphcore's chips, for instance, have more than 1,200 individual number-crunching cores, and can be linked together to provide still more power. Cerebras, a Californian startup, has taken an extreme approach. Chips are usually made in batches, with dozens or hundreds etched onto standard silicon wafers 300mm in diameter. Each of Cerebras's chips takes up an entire wafer by itself. That lets the firm cram 400,000 cores onto each.
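The chunking Mr Toon describes is easy to sketch. A toy Python illustration of an embarrassingly parallel workload, where the input splits into independent pieces with no communication between them (on real hardware the chunks run on thousands of cores, not a thread pool):

```python
from concurrent.futures import ThreadPoolExecutor

def chunk_sum(bounds):
    # Each chunk is fully independent: no shared state, no
    # communication with any other chunk.
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def parallel_sum_squares(n, chunks=8):
    # Cut [0, n) into independent ranges, compute each concurrently,
    # then combine the partial results at the end.
    step = max(1, n // chunks)
    bounds = [(lo, min(lo + step, n)) for lo in range(0, n, step)]
    with ThreadPoolExecutor() as pool:
        return sum(pool.map(chunk_sum, bounds))
```

Because no chunk ever needs another chunk's result, the only sequential work is splitting the input and summing the partial answers, which is why such workloads scale so well across many cores.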
Other optimisations are important, too. Andrew Feldman, one of Cerebras's founders, points out that AI models spend a lot of their time multiplying numbers by zero. Since those calculations always yield zero, each one is unnecessary, and Cerebras's chips are designed to avoid performing them. Unlike many tasks, says Mr Toon at Graphcore, ultra-precise calculations are not needed in AI. That means chip designers can save energy by reducing the fidelity of the numbers their creations are juggling. (Exactly how fuzzy the calculations can get remains an open question.)
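Both optimisations can be shown in toy form. The sketch below is illustrative only; real chips implement zero-skipping and reduced precision in hardware, and the function names are my own:

```python
import struct

def sparse_dot(xs, ys):
    # Skip multiplications where either operand is zero: the product
    # would be zero anyway, so the work is wasted.
    return sum(x * y for x, y in zip(xs, ys) if x != 0 and y != 0)

def to_half(x):
    # Round a Python float to 16-bit ("half") precision and back,
    # the kind of fidelity reduction the article describes. "e" is
    # the struct format code for a half-precision float.
    return struct.unpack("e", struct.pack("e", x))[0]
```

Calling `to_half(0.1)` returns a value slightly off from 0.1, which is exactly the trade the chip designers are making: a small loss of fidelity in exchange for less energy per number.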
All that can add up to big gains. Mr Toon reckons that Graphcore's current chips are anywhere between ten and 50 times more efficient than GPUs. They have already found their way into specialised computers sold by Dell, as well as into Azure, Microsoft's cloud-computing service. Cerebras has delivered equipment to two big American government laboratories.
Moore's law isn't possible any more
Such innovations will be increasingly important, for the AI-fuelled explosion in demand for computing power comes just as Moore's law is running out of steam. Shrinking chips is getting harder, and the benefits of doing so are not what they were. Last year Jensen Huang, Nvidia's founder, opined bluntly that "Moore's law isn't possible any more".
Other researchers are therefore looking at more exotic ideas. One is quantum computing, which uses the counter-intuitive properties of quantum mechanics to provide big speed-ups for some sorts of computation. One way to think about machine learning is as an optimisation problem, in which a computer is trying to make trade-offs between millions of variables to arrive at a solution that minimises as many as possible. A quantum-computing technique called Grover's algorithm offers big potential speed-ups, says Krysta Svore, who leads the Quantum Architectures and Computation Group at Microsoft Research.
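Grover's speed-up is quadratic and easy to quantify: unstructured search over N items takes about N/2 classical queries on average, versus roughly (π/4)√N quantum queries. A small sketch of the query counts (this computes the counts only; it does not simulate the algorithm itself):

```python
import math

def classical_queries(n):
    # Expected queries for unstructured search: on average you check
    # half the items before finding the marked one.
    return n / 2

def grover_queries(n):
    # Grover's algorithm needs about (pi/4) * sqrt(n) oracle queries.
    return (math.pi / 4) * math.sqrt(n)

# For a million items: ~500,000 classical queries vs ~785 quantum.
for n in (1_000_000,):
    print(n, classical_queries(n), round(grover_queries(n)))
```

The gap widens with N, which is why search-like subproblems inside optimisation are a natural target for quantum hardware, if and when it can be built at scale.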
Another idea is to take inspiration from biology, which proves that current brute-force approaches are not the only way. Cerebras's chips consume around 15kW when running flat-out, enough to power dozens of houses (an equivalent number of GPUs consumes many times more). A human brain, by contrast, uses about 20W of energy, about a thousandth as much, and is in many ways cleverer than its silicon counterpart. Firms such as Intel and IBM are therefore investigating "neuromorphic" chips, which contain components designed to mimic more closely the electrical behaviour of the neurons that make up biological brains.
For now, though, all that is far off. Quantum computers are relatively well-understood in theory, but despite billions of dollars in funding from tech giants such as Google, Microsoft and IBM, actually building them remains an engineering challenge. Neuromorphic chips have been built with existing technologies, but their designers are hamstrung by the fact that neuroscientists still do not understand what exactly brains do, or how they do it.
That means that, for the foreseeable future, AI researchers will have to squeeze every drop of performance from existing computing technologies. Mr Toon is bullish, arguing that there are plenty of gains to be had from more specialised hardware and from tweaking existing software to run faster. To quantify the nascent field's progress, he offers an analogy with video games: "We're past Pong," he says. "We're maybe at Pac-Man by now." All those without millions to spend will be hoping he is right.
This article appeared in the Technology Quarterly section of the print edition under the headline "Machine, learning"