Wait a second.
Grok’s symbol is a lightning bolt?
(None of them are THAT much better than previous, even worse in some areas)
how about you grok deez nuts (it is 1 am please forgive me)
OK Tech bros, repeat after me:
“Thou shalt not make a machine in the likeness of a human mind.”
I remember how we would get into trouble for copying each other’s homework in high school.
Now we get in trouble for generating each other’s homework 🤷
I wonder if a good fine-tuned model beats every general-purpose LLM if you need it for a really specific purpose
Yes it does
Of course, and this is why the new hotness is Mixture of Experts: one model that is effectively a bunch of experts arguing over the answer (a router sends each token to a few specialized subnetworks inside the same model). On a different scale there's Mixture of Agents, where different specialized agents perform specialized tasks.
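To make the "experts arguing" picture concrete, here's a toy MoE layer in PyTorch. All names and sizes are mine, purely for illustration; real implementations (Mixtral, DeepSeek, etc.) add load balancing and a lot of plumbing on top of this:

```python
# Minimal sketch of Mixture-of-Experts routing (illustrative only).
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    def __init__(self, d_model=512, n_experts=8, top_k=2):
        super().__init__()
        # Each "expert" is just a small feed-forward network.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, 4 * d_model),
                          nn.GELU(),
                          nn.Linear(4 * d_model, d_model))
            for _ in range(n_experts)
        )
        # The router scores every expert for every token.
        self.router = nn.Linear(d_model, n_experts)
        self.top_k = top_k

    def forward(self, x):  # x: (tokens, d_model)
        scores = self.router(x)                         # (tokens, n_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)  # top-k experts per token
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        # Each token's output is a weighted blend of its chosen experts;
        # the "arguing" is these routing weights.
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e
                if mask.any():
                    out[mask] += weights[mask, k, None] * expert(x[mask])
        return out
```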
There's a new project sharing ModernBERT models fine-tuned on specific tasks. Here's the org: https://huggingface.co/adaptive-classifier
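If the checkpoints there are standard transformers-style classifiers, loading one should just be the usual pipeline call. The model id below is a placeholder I made up; browse the org for real ones:

```python
# Hypothetical usage sketch -- the model id is a placeholder,
# see https://huggingface.co/adaptive-classifier for actual checkpoints.
from transformers import pipeline

clf = pipeline(
    "text-classification",
    model="adaptive-classifier/some-task-model",  # placeholder id
)
print(clf("This movie was surprisingly good."))
```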
I thought DeepSeek’s selling point was efficiency?
It was beating near-frontier models at a tenth of the cost could be hosted by you or any cloud service for you and the open weights were not censored for China's PR and could be jailbroken to say write code for shady stuff which any frontier model would refuse.
You dropped these: ,
No time for commas with how fast this tech is developing
AI as a tech product is advancing faster than any other tech I've ever seen; you mentioning DeepSeek's revelations from like 8 months ago already feels like an eternity
The most surprising thing about this image is that Grok is on it; they started way behind the 8-ball and have caught up
It's flip-flopping between being a full-on Nazi and arguing against MAGA with facts and logic.
The RL is so good that Grok changed its personality by changing a small part of its system prompt
Anakin: It’s aligned.
Padme: To be good?
It changed after Grok 3
yeah i hate musk but the fact grok was launched in nov 2023, years behind the competition, and has caught up is shocking
They got the whole Twitter database. It's kinda the same with Gemini. But somehow Meta isn't catching up; maybe their Llama 4 architecture isn't that stable to train.
Or maybe Facebook data is even worse than Twitter's?
Llama 3.3 was good, tho. For multimodal, Llama 4 also uses the Llama 3.2 approach, where image and text go through a single model instead of using a separate CLIP or SigLIP encoder.
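For anyone who hasn't seen the distinction, here's a rough pseudo-architecture sketch of the two wirings being contrasted. This is entirely illustrative (stand-in linear layers, made-up sizes), not Meta's actual code:

```python
# Rough sketch: separate vision encoder vs. single-model early fusion.
import torch
import torch.nn as nn

d_model = 512

# Option A: separate vision tower (CLIP/SigLIP style) -- image features
# come from an external encoder, then get projected into the LLM's space.
vision_tower = nn.Linear(3 * 14 * 14, d_model)  # stand-in for CLIP/SigLIP
projector = nn.Linear(d_model, d_model)         # adapter into the LLM

# Option B: "single model" -- image patches are embedded directly as
# tokens and go through the same transformer as the text.
patch_embed = nn.Linear(3 * 14 * 14, d_model)   # patches become tokens

patches = torch.randn(16, 3 * 14 * 14)          # 16 fake image patches
text_tokens = torch.randn(8, d_model)           # 8 fake text embeddings

tokens_a = torch.cat([projector(vision_tower(patches)), text_tokens])
tokens_b = torch.cat([patch_embed(patches), text_tokens])
# Either way the LLM sees one token sequence; the difference is whether
# a separately pretrained vision model sits in front of it.
```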