Summary: Trump administration officials are exploring extra curbs on the sale of Nvidia chips to China, Bloomberg reported on Wednesday, citing people with knowledge of the matter.
TMTPOST -- The Trump administration is reportedly considering further export controls on artificial intelligence (AI) chips from Nvidia Corporation amid the frenzy that Chinese upstart DeepSeek has set off across Wall Street, Silicon Valley and Washington.
Credit: Shutterstock
Trump administration officials are exploring extra curbs on the sale of Nvidia chips to China, Bloomberg reported on Wednesday, citing people with knowledge of the matter. The sources stressed that the conversations are at a very early stage as the new team works through its policy priorities.
One measure officials have reportedly focused on is expanding the restrictions to cover Nvidia's H20 chips, which can be used to run AI software and were designed to comply with the existing U.S. restrictions on China imposed by the Biden administration. The H20, a scaled-down offering tailored for China, is Nvidia's response to regulations introduced by the U.S. Department of Commerce's Bureau of Industry and Security (BIS) in October 2023, which imposed stricter export controls on semiconductor products, including Nvidia's high-performance AI chips.
The sources noted that a decision on any restrictions is likely a long way off, since the Trump administration is only beginning to staff up the relevant departments. An Nvidia spokesperson later said in a statement that the company is ready to work with the government as it pursues its own approach to AI.
Nvidia shares fell as much as 6.3% on Wednesday following the report, and dropped as much as 4.5% on Thursday before settling nearly 0.8% higher.
Two lawmakers on Thursday called on the Trump administration to weigh export curbs on Nvidia's H20 and chips of similar sophistication on national security grounds, alleging that DeepSeek extensively leveraged H20 chips to build its recently released AI model.
In their letter to National Security Advisor Michael Waltz, Chairman John Moolenaar, a Republican, and Ranking Member Raja Krishnamoorthi, a Democrat, of the House committee focused on U.S.-China competition asked for the move as part of a review ordered by Trump to scrutinize the U.S. export control system in light of "developments involving strategic adversaries."
Axios reported Thursday that the U.S. Congress has banned staff use of DeepSeek. "At this time, DeepSeek is under review by the CAO and is currently unauthorized for official House use," the report quoted a notice to congressional offices as saying. The notice said the House has taken security measures to restrict DeepSeek's functionality on all House-issued devices because "threat actors are already exploiting DeepSeek to deliver malicious software and infect devices."
The lawmakers' call and the congressional ban highlight Washington's concerns over the risks of DeepSeek's popular chatbot application, which is powered by its AI models. The highly efficient models, trained for a relatively small sum of money on less-advanced chips, spurred speculation that even if U.S. curbs succeed in hampering China's ability to deploy high-quality chips in AI systems, they may also accelerate the development of effective AI systems that do not rely on the highest-quality chips. DeepSeek's rise is evidence that Chinese firms, even a startup founded only in 2023, have managed to turn resource restrictions into innovation.
DeepSeek's mobile application jumped to the No. 1 spot in app stores this weekend, dethroning OpenAI's ChatGPT as the most downloaded free app on Apple's U.S. App Store. The surge followed its AI models going viral on the U.S. social media platform X last weekend.
What stunned Silicon Valley is that it took just $5.58 million for DeepSeek to train its V3 large language model (LLM). The startup claimed it used 2,048 Nvidia H800 chips, a downgraded version of Nvidia's H100 designed to comply with U.S. export restrictions. DeepSeek spent only about 2.6 million H800 GPU-hours on a model much better than Meta's, while Meta's compute budget for the Llama 3 model family could have trained DeepSeek-V3 at least 15 times over.
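For context, the headline figure is essentially a GPU rental estimate. As a rough back-of-the-envelope check, using the roughly $2-per-H800-GPU-hour rental price and the total of about 2.79 million GPU-hours (pre-training plus context extension and post-training) that DeepSeek cites in its own V3 technical report rather than figures from the reporting above:

2.79 million GPU-hours × $2 per GPU-hour ≈ $5.58 million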
Earlier this month DeepSeek released the open-source DeepSeek-R1, a family of reasoning models that it claims deliver performance comparable to leading offerings like OpenAI's o1 at a fraction of the cost. Several third-party tests have found that R1 actually outperforms OpenAI's latest model. R1 contains 671 billion parameters, and its "distilled" versions range from 1.5 billion to 70 billion parameters. The full R1 is available through DeepSeek's API at prices 90%-95% lower than o1's.
Source: TMTPost