Amazon to invest up to $4 billion in leading-edge AI firm Anthropic


E-commerce conglomerate Amazon announced on Monday 25th September 2023 that it will invest up to $4 billion in artificial intelligence (AI) firm Anthropic, a rival to ChatGPT developer OpenAI, and take a minority ownership position in the company.

The move reinforces Amazon’s aggressive AI push as it aims to keep pace with rivals such as Microsoft and Alphabet’s Google.

The two firms reportedly said that they are forming a strategic collaboration to advance generative AI, with the startup selecting Amazon Web Services as its primary cloud provider.

Money waiting to go into tech, turn it on


Reports suggest as much as $3 trillion is waiting on the sidelines to be invested in tech.

AI FOMO

The reasoning is that AI is driving a fear of missing out (FOMO). We could very well be experiencing the fourth industrial revolution right now, and it is AI-driven. Strategically, companies can’t just sit around and wait: there is a window of opportunity, and those that don’t recognise the potential and grab it will miss out.

IPOs

Three of the biggest initial public offerings (IPOs) in the tech sector in nearly two years raised some $6 billion collectively in less than a week. Nvidia, meanwhile, has attracted much attention with the AI-driven interest it has generated recently.

While a handful of tech IPOs and one big acquisition wouldn’t have been much cause for celebration in previous years, they are a welcome return after the pandemic-era investment drought.

The IPO market for tech was effectively shut down until Arm Holdings, Instacart and Klaviyo opened the investors’ door again. Merger and investment activity, such as Microsoft Corp.’s moves on OpenAI’s ChatGPT and Activision Blizzard Inc., is helping to lift the appetite for investment again. And it’s pretty much AI-induced.

Money ready to go

Some analysts suggest there is $3 trillion sitting on the sidelines ready to invest, mostly held by Big Tech and private equity companies. The fascination with artificial intelligence (AI) and the fear of missing out (FOMO) will create massive AI-led tech investing opportunities. Everyone will want a slice of this cake.

This could very well be the biggest transformational spending wave that we’ve seen in years and certainly since the internet arrived in 1995.

Just look out for that ‘bubble’ again – it will pop! But much money will be made before that happens and then again after.

The Magnificent Seven Tech Stocks – STOCK WATCH


The Magnificent Seven is a term used to describe seven tech stocks that have been surging in 2023.

  • Meta Platforms (formerly Facebook), the social media giant that also owns Instagram, WhatsApp, and Oculus.
  • Apple, the maker of the iPhone, iPad, Mac, Apple Watch, AirPods, and other popular devices and services, including iCloud and the Apple TV streaming service.
  • Amazon, the e-commerce leader that also operates AWS, Prime Video, Alexa, and Whole Foods.
  • Alphabet, the parent company of Google, YouTube, Gmail, Google Cloud, and Waymo.
  • Microsoft, the software company that owns Windows, Office, Azure, LinkedIn, Xbox, and Teams.
  • Nvidia, the semiconductor company that produces graphics cards, gaming devices, data center solutions, and AI platforms.
  • Tesla, the electric vehicle maker that also develops solar panels, batteries, and autonomous driving technology.

Dominant

These seven stocks are considered to be dominant in their respective fields and have strong growth prospects driven by innovation and artificial intelligence (AI).

They have outperformed the broader market and attracted many investors who are looking for exposure to the tech sector. Some analysts believe that these stocks will continue to lead the market in the future, while others caution that they may face regulatory challenges, competition, or valuation issues.

Approximate combined market cap of the Magnificent Seven tech stocks

The combined market cap of the Magnificent Seven as of September 2023 is approximately $10.8 trillion.

  • Apple: $2.5 trillion
  • Microsoft: $2.3 trillion
  • Alphabet: $1.9 trillion
  • Amazon: $1.7 trillion
  • Nvidia: $0.8 trillion
  • Meta Platforms: $0.9 trillion
  • Tesla: $0.7 trillion

Note that these values will change over time as the stock prices fluctuate.
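As a quick arithmetic check on that headline figure, here is a minimal Python sketch that simply sums the values listed above; the inputs are the same rough September 2023 approximations, so treat the output as indicative only.

```python
# Approximate market caps in USD trillions, September 2023 (values from the list above).
market_caps = {
    "Apple": 2.5,
    "Microsoft": 2.3,
    "Alphabet": 1.9,
    "Amazon": 1.7,
    "Nvidia": 0.8,
    "Meta Platforms": 0.9,
    "Tesla": 0.7,
}

total = sum(market_caps.values())
print(f"Combined market cap: ~${total:.1f} trillion")  # prints ~$10.8 trillion
```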

A way to trade the tech sector is through funds

There are many funds that can trade tech stocks, depending on your investment objectives, risk tolerance, and preferences.

Technology mutual funds: These are funds that invest in a diversified portfolio of technology companies across different industries, such as software, hardware, internet, cloud, biotech, and more. Technology mutual funds can offer exposure to the growth potential of the tech sector, as well as reduce the volatility and risk of investing in individual stocks. 

Some examples of technology mutual funds are Fidelity Select Technology Portfolio (FSELX), Columbia Global Technology Growth Fund (CGTYX), and Schwab U.S. Large-Cap Growth Index Fund (SCHG).

Which tech fund to invest in?

Technology exchange-traded funds (ETFs): These are funds that track an index of technology stocks and trade on an exchange like a stock. Technology ETFs can offer low-cost and convenient access to the tech sector, as well as allow investors to choose from different themes, such as cybersecurity, artificial intelligence (AI), cloud computing and more. 

Some examples of technology ETFs are Invesco QQQ Trust (QQQ), Technology Select Sector SPDR Fund (XLK), and VanEck Vectors Semiconductor ETF (SMH).

Technology index funds: These are funds that replicate the performance of a specific technology index, such as the Nasdaq 100, the S&P 500 Information Technology Index, or the Morningstar U.S. Technology Index. Technology index funds can offer broad and passive exposure to the tech sector, as well as low fees and high tax efficiency.

Some examples of technology index funds are Fidelity NASDAQ Composite Index Fund (FNCMX), Vanguard Information Technology Index Fund Admiral Shares (VITAX), and iShares Morningstar U.S. Technology ETF (IYW).
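For anyone doing their own research, a short sketch like the one below could pull recent prices for a few of the ETF tickers mentioned above and compare their year-to-date performance. It assumes the third-party yfinance package is installed and is purely illustrative, not a recommendation.

```python
# Illustrative only, not investment advice. Assumes `pip install yfinance`.
import yfinance as yf

tickers = ["QQQ", "XLK", "SMH", "IYW"]  # a few of the ETFs mentioned above

# Daily closing prices for 2023 year-to-date (auto_adjust folds in dividends and splits).
prices = yf.download(tickers, start="2023-01-01", end="2023-09-30", auto_adjust=True)["Close"]

# Simple year-to-date return per ticker: last price over first price, minus one.
ytd_return = (prices.iloc[-1] / prices.iloc[0] - 1) * 100
print(ytd_return.round(1).astype(str) + "%")
```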

NOTE: These are not recommendations. Investments may go up or down. Your money is at risk!

Always do your own research…

RESEARCH! RESEARCH! RESEARCH!

Baidu launches raft of AI applications after its Ernie chatbot receives massive public approval


More than 6 million users already

More than 6 million users have already used an AI-powered tool that sits inside Baidu’s Google Drive-like cloud product.

At its 4th September event, Baidu also demonstrated generative AI-based products that could assist with traffic management, financial research and coal mine logistics.

ChatGPT, from Microsoft-backed OpenAI, is not officially available in China, where Google and Facebook are blocked.

10 new AI products announced by Baidu

Chinese tech giant Baidu announced more than 10 new AI-based applications on 4th September 2023, just days after its ChatGPT-like Ernie bot was released for public use.

Among the products revealed was a generative AI-integrated word processing app called WPS AI, created by Shanghai-listed Kingsoft Office. It was reported the company built the tool using the AI model on which Baidu’s Ernie bot is based, as well as Baidu’s ‘Qianfan’ cloud platform for AI models.

‘This AI malarkey is progressing at quite a rate’.

Nearly 10,000 businesses are actively using Baidu’s Qianfan cloud platform each month, the company claimed.

AI assistant

The AI-powered assistant, which sits inside Baidu’s Google Drive-like cloud product, can search documents, summarize and translate text, and create content, the company claimed.

It wasn’t immediately clear to what extent those products were available for public use.

On 31st August 2023, Baidu released its Ernie bot to the public, signaling government approval of the AI-powered chatbot. Other Chinese companies also released similar AI products around the same time.

Nvidia’s stock at record high after Google AI deal


Nvidia shares rose 4.2% on Tuesday 29th August 2023 to close at a record high, after the company announced a partnership with Google that could expand distribution of its artificial intelligence (AI) technology.

The stock’s bountiful run continued, now up 234% in 2023, making it by far the best performer in the S&P 500. Facebook parent Meta is second in the index, up 148% so far this year.

The record close comes less than a week after the company said quarterly revenue doubled from a year earlier and gave a forecast indicating that sales this period could rise 170% on an annual basis. The day after the better-than-expected earnings report, the stock climbed to a record intraday high of $502.66 before declining later in the afternoon.

Nvidia’s business is booming because its graphics processing units (GPUs) are being gobbled up, as fast as Nvidia can make them, by cloud companies, government agencies and startups to train and deploy generative AI models like the technology behind OpenAI’s ChatGPT.

NVIDIA stock chart

Nvidia announcement

On Tuesday 29th August 2023, Nvidia CEO Jensen Huang appeared at a Google conference to announce an AI agreement between the two companies.

Through the partnership, Google’s cloud customers will have greater access to technology powered by Nvidia’s powerful H100 GPUs.

‘Our expanded collaboration with Google Cloud will help developers accelerate their work with infrastructure, software and services that supercharge energy efficiency and reduce costs’, the Nvidia CEO reportedly said in a blog post.

Nvidia’s GPUs are also available on competing cloud platforms from Amazon and Microsoft.

A brief history of ARM


Brief ARM history

Arm is a British semiconductor and software design company that is known for its Arm processors, which are widely used in smartphones, tablets, laptops, and other devices. Arm was founded in 1990 as a joint venture between Acorn Computers, Apple Computer, and VLSI Technology. The company was originally called Advanced RISC Machines, but later changed its name to Arm Ltd in 1998.

In 1985, the first Arm silicon chip was created by Acorn engineers Sophie Wilson and Steve Furber, who designed a 32-bit processor with a simple and elegant instruction set.

In 1990, Arm was spun off from Acorn as a separate company, with Apple as a major investor. Arm’s first product was the ARM6 processor, which was used in Apple’s Newton personal digital assistant.

Impression of the Apple Newton PDA device

In 1993, Arm introduced the ARM7 processor, which became one of the most successful embedded processors in history. It was used in devices such as the Nokia 6110 mobile phone, the Nintendo Game Boy Advance, and the Lego Mindstorms robotics kit.

In 1994, Arm launched the ARM9 processor family, which offered higher performance and lower power consumption than previous generations. The ARM9 was used in devices such as the Sony PlayStation Portable, the Palm Treo smartphone, and the Amazon Kindle e-reader.

In 1997, Arm introduced the ARM10 processor family, which featured a superscalar architecture and a floating-point unit. The ARM10 was used in devices such as the Apple iPod, the Samsung Galaxy S smartphone, and the Raspberry Pi computer.

In 1998, Arm changed its name from Advanced RISC Machines to Arm Ltd, reflecting its global expansion and recognition.

In 1999, Arm launched the ARM11 processor family, which featured a vector floating-point unit and a TrustZone security extension. The ARM11 was used in devices such as the iPhone 3G, the Nintendo DS, and the Raspberry Pi Zero.

In 2000, Arm became a public company, listing on the London Stock Exchange and the Nasdaq. The company raised £213 million in its initial public offering.

In 2001, Arm introduced the Cortex processor family, which offered a range of performance, power, and cost options for different applications. The Cortex processors are used in devices such as the Samsung Galaxy S10, the Apple Watch, and the Tesla Model 3.

In 2005, Arm acquired Artisan Components, a provider of physical intellectual property (IP) for chip design. This enabled Arm to offer a complete solution for system-on-chip (SoC) development.

In 2006, Arm announced the Mali graphics processing unit (GPU) family, which complemented its CPU offerings with high-performance graphics capabilities. The Mali GPUs are used in devices such as the Huawei Mate 20 Pro, the Oculus Quest, and the Samsung Smart TV.


In 2009, Arm partnered with IBM, Samsung, Texas Instruments, and others to form the Linaro consortium, which aimed to improve the Linux software ecosystem for Arm-based devices.

In 2010, Arm unveiled the Cortex-A15 processor, which was the first Arm processor to support virtualization and big.LITTLE technology. The Cortex-A15 was used in devices such as the Google Nexus 10, the LG G3, and the Nintendo Switch.

In 2011, Arm announced the Cortex-M0+ processor, which was the world’s most energy-efficient microcontroller. The Cortex-M0+ was used in devices such as the Arduino Nano 33 IoT, the Fitbit Flex 2, and the Nest Thermostat.

In 2012, Arm launched the Cortex-A53 and Cortex-A57 processors, which were the first Arm processors to support the 64-bit ARMv8 architecture. The Cortex-A53 and Cortex-A57 were used in devices such as the iPhone 6s, the Samsung Galaxy S6 Edge+, and the Microsoft Surface Pro X.

In 2013, Arm acquired Geomerics, a developer of real-time lighting technology for video games. This enhanced Arm’s graphics portfolio with dynamic illumination and global illumination effects.

In 2014, Arm introduced the Cortex-A72 processor, which delivered a 50% performance improvement over the previous generation. The Cortex-A72 was used in devices such as the Huawei P9, the Xiaomi Mi 5s Plus, and the Amazon Fire HD 10.

In 2015, Arm announced the Cortex-A35 processor, which was the most efficient Arm processor for smartphones and tablets. The Cortex-A35 was used in devices such as the Nokia 2.1, the Samsung Galaxy J2 Core, and the Lenovo Tab M7.

In 2016, Arm was acquired by SoftBank Group for £24.3 billion, becoming a subsidiary of the Japanese conglomerate. The deal was motivated by SoftBank’s vision of investing in technologies that would drive the future of artificial intelligence (AI), internet of things (IoT), and smart cities.

In 2017, Arm launched Project Trillium, a suite of machine learning (ML) solutions that included an ML processor, an object detection processor, and an open-source software framework. The Project Trillium products aimed to enable low-power and high-performance ML applications on edge devices.

In 2018, Arm unveiled the Cortex-A76 processor, which offered a 35% performance boost over its predecessor. The Cortex-A76 was used in devices such as the OnePlus 7T, the Huawei MateBook D14, and the Acer Chromebook Spin 13.

In 2019, Arm announced the Cortex-A77 processor, which improved on its predecessor with a higher clock speed, a larger cache, and better branch prediction. The Cortex-A77 was used in devices such as the Samsung Galaxy S20, the Asus ROG Phone II, and the Lenovo Yoga C940.

In 2020, Arm introduced the Cortex-X1 processor, which was its most powerful CPU design to date. The Cortex-X1 was designed to deliver peak performance for premium devices, such as flagship smartphones, laptops and gaming consoles. The Cortex-X1 was used in devices such as the Samsung Galaxy S21 Ultra, the Xiaomi Mi 11, and the Google Pixel 6.

In 2021, Arm launched the Cortex-A78C processor, which was optimized for high-performance computing (HPC) applications. The Cortex-A78C featured up to eight CPU cores, a larger L3 cache, and support for ECC memory. The Cortex-A78C was used in devices such as the Samsung Galaxy Book Pro, the HP Elite Folio, and the Acer Chromebook Spin 513.


In 2022, Arm unveiled the Cortex-A710 processor, which was its first big core to support the Armv9 architecture. The Cortex-A710 offered a 30% energy efficiency improvement over its predecessor, as well as enhanced security and ML features. The Cortex-A710 was used in devices such as the OnePlus 10 Pro, the Huawei MatePad Pro 2, and the Microsoft Surface Laptop Studio.

In 2023, Arm announced the Immortalis GPU family, which was its next-generation graphics solution that included hardware-based ray-tracing and variable rate shading capabilities. The Immortalis GPUs aimed to deliver realistic and immersive graphics for gaming, VR and AR applications on mobile devices. The Immortalis GPUs were used in devices such as the Samsung Galaxy S22 Ultra, the Sony Xperia 1 IV, and the Oculus Quest 3.

Powerful world presence

Arm is a leading semiconductor and software design company that has revolutionized the computing industry with its innovative and efficient processor architectures. Arm’s processors power billions of devices across various domains, such as mobile, IoT, AI, HPC, and gaming. Arm has been at the forefront of technological advancements for over three decades, delivering performance, energy efficiency, and security to its customers and partners.

Arm is a subsidiary of SoftBank Group and has a massive global presence.

ARM lists in U.S. and not UK

ARM IPO

British microchip design giant Arm has announced that it has filed paperwork to sell its shares in the U.S.

The Cambridge-based company, which designs chips for devices from smartphones to game consoles, plans to list on New York’s Nasdaq in September. The highly anticipated U.S. IPO comes after the UK Prime Minister failed to convince Arm to float in London or pursue a dual UK-U.S. listing.

Arm’s decision to list in New York rather than London has fuelled fears that the City is losing its competitiveness to Wall Street, where valuations are typically higher. SoftBank-owned chip designer Arm on 21st August 2023 disclosed a modest 1% fall in annual revenue as it made public the paperwork for a U.S. listing that is expected to be the year’s biggest initial public offering. The company is reportedly looking for a valuation of between $60bn (£47bn) and $70bn.

Arm was bought in 2016 by Japanese conglomerate SoftBank in a deal worth £24.3bn. Prior to the takeover, it had been listed in both London and New York for 18 years.

Companies that use ARM processors in their products

Some of the companies that use ARM processors include Apple, Qualcomm, Samsung, Broadcom, and Fujitsu. ARM technology is used in a wide range of devices, from smartphones to game consoles to supercomputers.


Amazon – leading or competing?

The power of AI

Amazon is one of the leading companies in the field of artificial intelligence (AI) and has been developing its own custom chips to power its AI applications and services.

Amazon’s AI chips are designed to perform tasks such as natural language processing, computer vision, speech recognition, and machine learning inference and training.

AI chips created by Amazon

  • AZ2: This is a processor built into the Echo Show 15 smart display and powers artificial intelligence tasks like understanding your voice commands and figuring out who is issuing those commands. The AZ2 chip also enables features such as visual ID, which can recognize faces and display personalized information on the screen.
  • Inferentia: This is a high-performance chip that Amazon launched to deliver low-cost and high-throughput inference for deep learning applications. Inferentia powers Amazon Elastic Compute Cloud (EC2) Inf1 instances, which are optimized for running inference workloads on AWS. Inferentia also powers some of Amazon’s own services, such as Alexa, Rekognition, and SageMaker Neo.
  • Trainium: This is a chip that Amazon designed to provide high-performance and low-cost training for machine learning models. Trainium will power Amazon EC2 Trn1 instances, which are designed to train increasingly complex models, such as large language models and vision transformers. Trainium will also support scale-out distributed training with ultra-high-speed connectivity between accelerators (see the sketch after this list).
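To give a flavour of how these chips surface to AWS developers, the sketch below uses the standard boto3 SDK to look up a few Inferentia- and Trainium-backed EC2 instance types. The specific instance type names and region are illustrative assumptions, availability varies by region, and this is a rough sketch rather than an official example.

```python
# Rough sketch using the boto3 SDK (assumes AWS credentials are configured).
# The instance type names and region below are illustrative assumptions.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.describe_instance_types(
    InstanceTypes=["inf1.xlarge", "inf2.xlarge", "trn1.2xlarge"]
)

for itype in response["InstanceTypes"]:
    vcpus = itype["VCpuInfo"]["DefaultVCpus"]
    mem_gib = itype["MemoryInfo"]["SizeInMiB"] // 1024
    print(f"{itype['InstanceType']}: {vcpus} vCPUs, {mem_gib} GiB memory")
```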

Despite advancements, is Amazon chasing to keep up?

Amazon is racing to catch up with Microsoft and Google in the field of generative AI, which is a branch of AI that can create new content or data from existing data. Generative AI can be used for applications such as natural language generation, image and video synthesis, text summarization, and personalization.

AI models from Amazon

  • Titan: This is a family of large language models (LLMs). Titan models can generate natural language texts for various domains and tasks, such as conversational agents, document summarization, product reviews, and more. Titan models are trained on a large and diverse corpus of text data from various sources, such as books, news articles, social media posts, and product descriptions.
  • Bedrock: This is a service that Amazon created to help developers enhance their software using generative AI. Bedrock provides access to pre-trained Titan models and tools to customize them for specific use cases. Bedrock also allows developers to deploy their generative AI applications on AWS using Inferentia or Trainium chips (a minimal sketch follows below).
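As a rough illustration of the developer-facing side of Bedrock, a call to a Titan text model through the Bedrock runtime API might look something like the sketch below. The model ID, request and response shapes, and region are assumptions based on AWS documentation around that time, so check the current Bedrock docs before relying on any of it.

```python
# Hedged sketch: invoking a Titan text model via Amazon Bedrock with boto3.
# The model ID and request/response shapes are assumptions; consult the Bedrock docs.
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.invoke_model(
    modelId="amazon.titan-text-express-v1",  # assumed Titan model identifier
    contentType="application/json",
    accept="application/json",
    body=json.dumps({"inputText": "Summarise this product review in one sentence: ..."}),
)

result = json.loads(response["body"].read())
print(result)  # the generated text sits inside the returned JSON payload
```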

Generative AI

Amazon’s CEO, Andy Jassy, has previously said he thinks of generative AI as having three macro layers: the compute, the models, and the applications. He said that Amazon is investing heavily in all three layers and that its custom chips are a key part of its strategy to provide high-performance and low-cost compute for generative AI. He also said that Amazon is not used to chasing markets but to creating them, and that he believes Amazon has the best platform for generative AI in the world.

Inferentia and Trainium offer AWS customers an alternative to training their large language models on Nvidia GPUs, which have become difficult and expensive to procure.

‘The entire world would like more chips for doing generative AI, whether that’s GPUs or whether that’s Amazon’s own chips that we’re designing’, Amazon Web Services CEO Adam Selipsky is reported to have said. ‘I think that we’re in a better position than anybody else on Earth to supply the capacity that our customers collectively are going to want’.

Fast actors

Yet others have acted faster, and invested more, to capture business from the generative AI boom. When OpenAI launched ChatGPT in November 2022, Microsoft gained widespread attention for hosting the chatbot and for investing a reported $13 billion in OpenAI. It was quick to add OpenAI’s generative AI models to its own products, incorporating them into Bing in February 2023.

That same month, Google launched its own AI chatbot, Bard, followed by a reported $300 million investment in OpenAI rival Anthropic.


It wasn’t until April 2023 that Amazon announced its own family of large language models, called Titan, along with a service called Bedrock to help developers enhance software using generative AI.

Amazon is not used to chasing markets; it is used to creating them. And for the first time in a while, it finds itself on the back foot, working to play catch-up.

And Meta?

Meta also recently released its own LLM, Llama 2. The open-source ChatGPT rival is now available for people to test on Microsoft’s Azure public cloud.

The AI battle continues…

Hackers to compete for $20 million prize


The U.S. cyber hacker challenge is a new initiative launched by the Biden administration in August 2023 to use artificial intelligence (AI) to protect critical U.S. infrastructure from cybersecurity risks. 

The challenge will offer $20 million in prize money and includes collaboration from leading AI companies Anthropic, Google, Microsoft and OpenAI, who will make their technology available for the competition. The challenge was announced at the Black Hat USA hacking conference in Las Vegas.

The competition will consist of three stages

  • Qualifying event in the spring of 2024
  • Semifinal at DEF CON 2024
  • Final at DEF CON 2025 

The competitors will be asked to use AI to secure vital software and to open-source their systems so that their solutions can be used widely (does that create a risk in itself?). The top three teams will be eligible for additional prizes, including a top prize of $4 million for the team that best secures vital software.

The challenge aims to explore what’s possible when experts in cybersecurity and AI have access to a suite of cross-company resources. The U.S. government hopes that the promise of AI can help further secure critical U.S. systems and protect Americans from future cyber attacks!

Limitations and risks of using AI for security

However, there are flaws and drawbacks of using AI for cybersecurity, both for the attackers and the defenders.

  • Lack of transparency and explainability: AI systems are often complex and opaque, making it difficult to understand how they make decisions or what factors influence their outputs. This can lead to trust issues, ethical dilemmas, and legal liabilities.
  • Overreliance on AI: AI systems are not infallible and may make mistakes or produce false positives or negatives. Relying too much on AI, without human oversight or verification can result in missed threats, erroneous actions, or unintended consequences.
  • Bias and discrimination: AI systems may inherit or amplify human biases or prejudices that are present in the data, algorithms, or design of the systems. This can result in unfair or discriminatory outcomes, such as excluding certain groups of people from access to services or opportunities, or targeting them for malicious attacks.
  • Vulnerability to attacks: AI systems may be susceptible to adversarial attacks, such as data poisoning, model stealing, evasion, or exploitation. These attacks can compromise the integrity, availability, or confidentiality of the systems, or manipulate them to produce malicious outputs.
  • High cost: Developing and maintaining AI systems for cybersecurity requires a lot of resources, such as computing power, memory, data, and skilled personnel. These resources may not be easily accessible or affordable for many organizations or individuals.
‘Well, what do you think of AI and cybersecurity sharing resources?’ ‘Ha! Playing right into our hands.’

These are some of the flaws of using AI for cybersecurity, but they are not insurmountable. With proper research, regulation, education, and collaboration, AI can be a powerful ally in enhancing cybersecurity and protecting against cyber threats – that is until it takes over, but that will never happen… will it?

Google says people should use its search engine to check whether information provided by its Chatbot, Bard, is actually accurate


Accuracy

According to a recent news article, Google says people should use its search engine to check whether information provided by Bard is actually accurate, as Bard may display inaccurate or offensive information that doesn’t represent Google’s views. Just Google’s views, I wonder…?

Google’s UK boss Debbie Weinstein said Bard was not really the place to go to search for specific information, but rather an experiment best suited to collaboration around problem-solving and creating new ideas.

‘Just checking the answer with my search engine!’

Hallucinate

According to an Android Authority article, both Bard and ChatGPT can hallucinate or confidently lie when asked about obscure topics. Bard does offer a link to search results and will sometimes cite a source or two. However, Google states that Bard can even lie about its own inner workings so you cannot trust everything it says…?

Testing… 1… 2… 3…?

According to a report by Marie Haynes, Bard predicted it would generate accurate responses 85% of the time by September 2023, but in an experiment it posted an accuracy score of 63%, meaning it gave incorrect information in more than a third of its responses.

Early days, or harbouring a problem for the future?

AI race gathers momentum as China’s Baidu claims its Ernie Bot is Better than ChatGPT on key tests


Baidu said its AI system, called Ernie 3.5, outperformed OpenAI’s ChatGPT and GPT-4 in several key areas.

  • The chatbot was revealed in March 2023 and Baidu has since been publicly testing it in China. The chatbot is based on Baidu’s foundational AI model, called ERNIE.
  • Baidu’s advancements underscore the intense competition taking place in the area of generative AI with technology giants in the US and China rapidly advancing their AI models.

ERNIE: Enhanced Representation through kNowledge IntEgration

US and China AI Bots go head to head

Ernie was first introduced in 2019, and since then Baidu has been improving and upgrading it with new versions. The latest version, Ernie 3.5, was announced in June 2023, and Baidu claims it outperforms OpenAI’s ChatGPT and GPT-4 in several key areas.

Baidu’s Ernie is an artificial intelligence (AI) model that powers the company’s chatbot service, Ernie Bot. Ernie stands for Enhanced Representation through kNowledge IntEgration, and it is a natural language processing (NLP) deep-learning model that can understand and generate natural language.

Trained on large data sets

Ernie 3.5 is based on Baidu’s foundational AI model, which is trained on huge amounts of data from various domains, such as news, social media, encyclopedias, books, and more. Ernie 3.5 can handle various NLP tasks, such as question answering, dialogue generation, text summarization, sentiment analysis, and more.

According to a test by the China Science Daily journal, Ernie 3.5 surpassed ChatGPT in general abilities and outperformed the more advanced GPT-4 on several Chinese-language capabilities.

ERNIE version 3.5 boosted training efficiency, making it faster and cheaper to upgrade to future versions. Baidu hopes that ERNIE Bot will become the next must-have app in China’s internet market, attracting users with its natural and engaging conversations.

Integration

Baidu has been integrating ERNIE Bot across multiple business applications, ranging from cloud computing to smart speakers. 


ERNIE Bot is one example of how Baidu is investing in AI technology and competing with other tech giants in the US and China. Baidu’s founder Robin Li reportedly said that ‘foundation models are an engine driving global economic growth and represent a major strategic opportunity that cannot be missed’.

The major BIG players, Alphabet (Google), Microsoft and Meta, all have their own versions of AI. Hopefully it will be used ‘intelligently’.