U.S. introduces new microchip-related export controls

U.S. chip rules

The Biden administration is reportedly implementing new export controls on essential technologies, such as quantum computing and semiconductor materials, in response to China’s progress in the global chip market.

These controls encompass quantum computers and their components, sophisticated chipmaking tools, semiconductor technologies, certain metal and metal alloy components and software, and high-bandwidth chips, which are vital for AI applications.

While the U.S. intensifies its measures to curb China’s expansion, there is noticeable hesitancy within the global industry.

The U.S. Department of Commerce issued the new regulations on Friday, 6 September 2024.


EU passes world’s first major ‘act’ to regulate AI

The European Union (EU) has made history by approving the world’s first comprehensive regulatory framework for artificial intelligence (AI).

Artificial Intelligence Act

Known as the Artificial Intelligence Act, this groundbreaking legislation is expected to serve as a global signpost for other governments grappling with how to regulate this fast-developing technology.

The AI Act takes a risk-based approach, categorizing AI applications by risk level. It bans uses deemed to pose unacceptable risk, imposes strict obligations on high-risk systems, emphasizes transparency, and aims to keep AI development human-centric. This landmark regulation should help set a precedent for responsible AI deployment worldwide.

The regulation is expected to become enforceable in May 2024, after passing final checks and receiving endorsement from the European Council.

Crypto firms introduce risk assessments and finance tests in response to strict new rules in the UK

Cryptocurrency

New rules and risk assessments for UK Crypto traders

Coinbase and Gemini, for example, are among cryptocurrency exchanges that now require U.K. users to fill out risk assessments. These questionnaires are designed to test their financial knowledge.

The measures respond to new UK rules requiring crypto companies to clearly inform users of the risks involved in trading cryptocurrencies. Customers who fail to complete the assessments will be prevented from trading with their crypto accounts.

Risk warning

Crypto.com, Coinbase, Gemini and other cryptocurrency exchanges are warning UK users that they will need to complete investment questionnaires. These are aimed at testing their financial knowledge before they are allowed to trade.

The companies have told UK users they must complete a declaration about what type of investor they are, and respond to a set of questions on financial services before they can use the platforms.

Clients’ declaration

In the clients’ declaration section, users are asked to select their investor profile and to inform the company of their financial status.

The questions include: are you a high-net-worth customer earning above £100,000 per annum, or with a net worth of more than £250,000? Or are you a ‘restricted investor’ who will not invest more than 10% of your assets? Clients who do not complete the declaration are prevented from trading crypto-related products.

The financial questionnaires require users to answer numerous questions about the range of products available, to ensure clients fully understand the potential volatility of crypto assets.

Strict rules to protect the retail trader

Since the UK passed the Financial Services and Markets Act, companies that offer crypto assets and certain types of digital currency, known as stablecoins, are now covered by UK law.

These are the same rules as those that govern traditional financial services and are aimed at protecting the retail trader.

EU agrees deal on AI regulation

AI rules

European Union officials have reached a provisional deal on the world’s first comprehensive laws to regulate the use of artificial intelligence (AI).

The agreement covers AI used in systems such as ChatGPT and facial recognition.

The European Parliament will vote on the AI Act proposals early next year, but any legislation will not take effect until 2025 at the earliest. The U.S., UK and China are all rushing to publish their own guidelines.

Safeguards

The proposals include safeguards on the use of AI within the EU, as well as limitations on its adoption by law enforcement agencies.

European Commission President Ursula von der Leyen said the AI Act would help the development of technology that does not threaten people’s safety and rights. Consumers would have the right to launch complaints and fines could be imposed for violations.

Unique framework

In a social media post, she said it was a ‘unique legal framework for the development of AI you can trust’.

The European Parliament defines AI as software that can ‘for a given set of human-defined objectives, generate outputs such as content, predictions, recommendations or decisions influencing the environments they interact with.’

This is a significant step towards ensuring that AI development and deployment are aligned with ethical standards and respect for human rights.

Will the AI rules of the EU, UK, U.S., China and other countries conflict?