NSF and Nvidia award Ai2 $152M to support building an open AI ecosystem (allenai.org)
98 points by _delirium 3 hours ago | 43 comments
jeffreysmith 1 hour ago [-]
Not sure what's with the HN tone on this announcement. AI2 are really some of the best people around for creating truly open artifacts for the whole ecosystem. Their work on OLMo and Molmo is some of the most transparent and educational material you can find on model building. This is just great news for everyone.
thefaux 1 hour ago [-]
Personally, I suspect this technology is the worst development since nuclear weapons and, in some ways, may be even worse, so I am deeply skeptical that this is great news for anyone.
Difwif 43 minutes ago [-]
You're right. It's time to ban the evil numbers from being matrix multiplied. Contact your local representative about CPU control.
mattigames 15 minutes ago [-]
Very funny, but the problem is more about the input than whatever the CPU is doing; ChatGPT would be no more than a footnote without the copyrighted material in all its datasets.
artninja1988 7 minutes ago [-]
We all stand on the shoulders of giants :)
bigyabai 12 minutes ago [-]
Not really?
chvid 50 minutes ago [-]
Exactly why is that? Surely LLMs have uses beyond pure destruction (unlike a nuclear weapon).
khalic 51 minutes ago [-]
You probably need to look under the hood; it has nothing to do with what popular culture called AI until very recently. It’s just a word generator on steroids. Don’t believe the hype about AIs taking over; it’s complete BS.
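The "word generator on steroids" characterization refers to autoregressive next-token sampling: the model repeatedly predicts a probability distribution over its vocabulary and draws the next token from it. A toy sketch of that loop (hypothetical four-word vocabulary, with a dummy uniform distribution standing in for a real model's forward pass):

```python
import random

# Toy autoregressive sampler: repeatedly draw the next token from a
# probability distribution over a vocabulary, conditioned on the text
# so far. A real LLM computes these probabilities with a neural network;
# this stub just returns a uniform distribution.
vocab = ["the", "cat", "sat", "down"]

def next_token_probs(context):
    # Placeholder for the model's forward pass over the context.
    return [0.25] * len(vocab)

def generate(prompt, n_tokens, seed=0):
    rng = random.Random(seed)
    out = list(prompt)
    for _ in range(n_tokens):
        probs = next_token_probs(out)
        out.append(rng.choices(vocab, weights=probs)[0])
    return " ".join(out)

print(generate(["the"], 3))
```

Everything downstream (chat behavior, tool use, and so on) is built on this one sampling loop, which is why both the "just a word generator" and the "astounding difference" views can be argued from the same mechanism.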
winter_blue 5 minutes ago [-]
Fwiw, the difference it makes in software development speed is astounding.
hobofan 2 hours ago [-]
If Nvidia were interested in "open" AI, they would collaborate with AMD and others to build an (updated) open alternative to CUDA. That's probably the most closed part of the whole stack right now.
sounds 2 hours ago [-]
Nvidia is interested in commoditizing their complements. It's a business strategy to decrease the power of OpenAI (for instance).

Nvidia dreams of a world where there are lots of "open" alternatives to OpenAI, like there are lots of open game engines and lots of open software in general. All buying closed Nvidia chips.

amelius 2 hours ago [-]
But AI depends on a small number of tensor operators, primitives that competitors can implement relatively easily, so compute is very close to being a commodity when it comes to AI.

A company like Cerebras (founded in 2015) proves that this is true.

The moat is not in computer architecture. I'd say the real moat is in semiconductor fabrication.
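The "small number of tensor operators" claim can be made concrete: the core of transformer inference is essentially matrix multiplies plus a few elementwise and normalization primitives. A minimal NumPy sketch of scaled dot-product attention (illustrative only, with made-up shapes; not any vendor's actual kernel library):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: one of the few non-matmul primitives needed.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    # Scaled dot-product attention: two matrix multiplies and a softmax.
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores) @ v

rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((4, 8)) for _ in range(3))
print(attention(q, k, v).shape)  # (4, 8)
```

Accelerators like TPUs or Cerebras wafers target exactly these primitives; the hard part is making them fast at scale, not the math itself.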

sounds 1 hour ago [-]
Have you ever tried to run a model from huggingface on an AMD GPU?

Semiconductor fabrication is a high risk business.

Nvidia invested heavily in CUDA and out-competed AMD (and Intel). They are working hard to keep their edge in developer mindshare, while chasing hardware profits at the same time.

amelius 41 minutes ago [-]
> Have you ever tried to run a model from huggingface on an AMD GPU?

No, but seeing how easily they run on Apple hardware, I don't understand your point, to be honest.

bilbo0s 1 hour ago [-]
> which can be relatively easily implemented by competitors

Oh my.

Please people, try to think back to your engineering classes. Remember the project where you worked with a group to design a processor? I do. Worst semester of my life. (Screw whoever even came up with that damn real analysis math class.) And here's the kicker, I know I'll be dating myself here, but all I had to do for my part was tape it out. Still sucked.

Not sure I'd call the necessary processor design work here "relatively easy"? Even for highly experienced, extremely bright people, this is not "relatively easy".

Far easier to make the software a commodity. Believe me.

amelius 39 minutes ago [-]
To be totally honest, the only thing I can distill from this is that perhaps you should have picked an education in CS instead of EE.

I mean this is like saying that a class for building compilers sucked. Still, companies make compilers, and they aren't all >$1B companies. In fact, hobbyists make compilers.

PeterStuer 1 hour ago [-]
I thought they assumed AI hardware would become commoditized sooner rather than later, and that their strategy was to sell complete vertically integrated AI solution stacks, mainly a software and services play?
arthurcolle 2 hours ago [-]
Why is OpenAI a threat to Nvidia? They are still highly dependent on those GPUs
tomrod 2 hours ago [-]
Two concepts

- Monopsony is the inverse of Monopoly -- one buyer. Walmart is often a monopsony for suppliers (exclusive or near exclusive).

- Desire for vertical integration and value extraction, related to #1 but with some additional nuances

next_xibalba 2 hours ago [-]
Who is the one buyer in the Nvidia scenario? How would that benefit Nvidia?
KaoruAoiShiho 2 hours ago [-]
It would hurt Nvidia, not benefit it; that's why Nvidia spends a lot of effort to prevent that from happening, and it's not the case currently.

They really need to avoid the situation in the console market, where the fact that there are only 3 customers means almost no margins on console chips.

next_xibalba 28 minutes ago [-]
Prior to the A.I. boom, Nvidia had a much, much more diverse customer base in terms of revenue mix. According to their 2015 annual report[1], their revenues were spread across the following revenue segments: gaming, automotive, enterprise, HPC and cloud, and PC and mobile OEMs. Gaming was the largest segment and contributed less than 50% of revenues. At this time, with a diverse customer base, their gross margins were 55.5%. (This is a fantastic gross margin in any industry outside software).

In 2025 (fiscal year), Nvidia only reported two revenue segments: compute and networking ($116B revenue) and graphics ($14.3B revenue). Within the compute and networking segment, three customers represented 34% of all revenue. Nvidia's gross margins for fiscal 2025 were 75% [2].

In other words, this hypothesis doesn't fit at all. In this case, having more concentration in extremely deep-pocketed customers competing over a constrained supply of product has caused margins to skyrocket. Moreover, GP's claim of monopsony doesn't make any sense: Nvidia is not at any risk of having a single buyer, and with the recent news that sales to China will be allowed, the customer base is going to become more diverse, creating even more demand for their products.

[1] https://s201.q4cdn.com/141608511/files/doc_financials/annual...

[2] https://s201.q4cdn.com/141608511/files/doc_financials/2025/a...

grim_io 2 hours ago [-]
Google shows that Nvidia is not necessary. How long until more follow?
NitpickLawyer 2 hours ago [-]
Tbf, Google started a long time ago with their TPUs, and they've had some bumps along the way. It's not as easy as one might think. There are certainly efforts to produce alternatives, but even the ASIC-like providers like Cerebras and Groq are having problems with large models. They seemed very promising with SLMs, but once MoEs became a thing they started to struggle.
arthurcolle 1 hour ago [-]
I agree in principle, but you can't just yolo fab TPUs and leapfrog Google.
vlovich123 2 hours ago [-]
If OpenAI becomes the only buyer, they can push around Nvidia and invest in alternatives to blunt their power. If OpenAI is one of many customers, then they’re not in a strong bargaining position and Nvidia gets to set the terms.
patates 2 hours ago [-]
Maybe if they grow too much they'd develop their own chips. Also if one company wins, as in they wipe out the competition, they'd have much less incentive to train more and more advanced models.
victorbjorklund 1 hour ago [-]
One large customer has more bargaining power than many big ones. And the risk is that OpenAI would try to make its own chips if it captured the whole market.
someone7x 2 hours ago [-]
> commoditizing their complements

Feels like a modern euphemism for “subjugate their neighbors”.

skybrian 1 hour ago [-]
No, it’s encouraging competition and cost-cutting in a part of the market they don’t control. This can be a reason for companies to support open source, for example.

Meanwhile, the companies running data centers will look for ways to support alternatives to Nvidia. That’s how they keep costs down.

It’s a good way to play companies off each other, when it works.

jvanderbot 2 hours ago [-]
Business has always been a civilized version of war, and one which will always capture us in similar ways, so I guess wartime analogies are appropriate?

Still, it feels awfully black and white to phrase it that way when this is a clear net good and a better alignment of incentives than before.

kookamamie 2 hours ago [-]
Indeed. This is throwing pennies at virtue-signaling openness.
sim7c00 1 hour ago [-]
You are not wrong. Opening up CUDA would be a real power move. I think people would mind some of their other crap practices a lot less.
colechristensen 49 minutes ago [-]
They could also publish all of their source code, die designs, put their patents in the public domain and go live on a beach somewhere and fish for a living.

CUDA is what they sell; it makes more sense for them to charge for hardware and give the hardware-locked software away for free.

bongodongobob 2 hours ago [-]
That's AMD's fault, not Nvidia's.
khalic 1 hour ago [-]
People seem to be missing the fact that Ai2 is an initiative by the Allen Institute for AI, not a company.
NitpickLawyer 43 minutes ago [-]
And they've already released open source models, with data and training code. They're definitely the good guys here.
datadrivenangel 2 hours ago [-]
Suggest changing the title to:

NSF and NVIDIA award Ai2 $152M to support building a fully open AI ecosystem

To better indicate that this is not related to OpenAI and that the group intends to release everything needed to train their models.

brunohaid 2 hours ago [-]
Maybe that'll help them hire someone who can at least respond to S2 API key requests...

Being open is great, but if over the course of 6 months 3 different entities (including 2 well known universities) apply and send more than a dozen follow ups to 3 different "Reach out to us!" emails with exactly 0 response, the "open" starts sounding like it's coming from Facebook.

pmdr 1 hour ago [-]
So basically Nvidia handing out cash to itself. <insert Obama medal meme>
zoobab 2 hours ago [-]
"Open" like an open source FPGA implementation of their chips?
cruffle_duffle 51 minutes ago [-]
I can’t wait until I can run this shit locally without spending $10,000 on clusters of GPUs. The models will be trained using some distributed P2P-like technology that “The Man” can’t stop.

Imagine running a model trained by the degenerates on 4chan. Models that encourage you to cheat on your homework, happily crank out photos of world leaders banging farm animals, and gleefully generate the worst gore imaginable. And best of all, any junior high schooler on the planet can do it.

That’s how you know this technology has arrived. None of this sanitized, lawyer approved, safety weenie certified, three letter agency authorized nonsense we have now. Until it’s all local, it’s just an extension of the existing regime.

philipkglass 23 minutes ago [-]
> Imagine running a model trained by the degenerates on 4chan.

It has been done before:

https://en.wikipedia.org/wiki/GPT4-Chan

https://archive.org/details/gpt4chan_model_float16
