Elon Musk's answer to ChatGPT is getting an update to make it better at math, coding and more. Musk's xAI has launched Grok-1.5 to early testers with "improved capabilities and reasoning" and the ...
Grok 4 is a huge leap from Grok 3, but how good is it compared to other models in the market, such as Gemini 2.5 Pro? We now have answers, thanks to new independent benchmarks. LMArena.ai, which is an ...
The new coding model may be faster, but it is also less honest, faring worse on that measure than the already problematic Grok 4. Katelyn is a writer with CNET covering artificial intelligence, including ...
The rise of vibe coding is based on the promise of services like GPT-5: that in the future, you won’t have to know how to program at all in order to “create” software — you’ll just need to know how to ...
Have you ever had a brilliant app idea but felt overwhelmed by the thought of coding it from scratch? Or maybe you’re a seasoned developer looking for a way to streamline repetitive tasks and focus on ...
Grok is coming for the coding-focused customers of its bigger rivals. Elon Musk's xAI dropped a new AI model this week, specifically for developers, calling it a "speedy and economical reasoning model ...
xAI hired contractors to help Grok climb a popular AI leaderboard with the goal of overtaking Anthropic. Training documents show xAI wanted to "beat Sonnet 3.7 Extended," Anthropic's rival coding model. AI ...
Elon Musk-funded xAI is skipping Grok 3.5 and releasing Grok 4 after Independence Day in the United States, and it could be the best model from the company. Grok 3.5 was originally supposed to be a ...
Elon Musk’s xAI Holdings Corp. has released grok-code-fast-1, a dedicated agentic coding artificial intelligence model that is extremely speedy and designed to strike a “compelling balance between ...
The world of artificial intelligence is moving at an incredible pace, with breakthroughs emerging constantly. Now, attention turns to xAI's upcoming release: Grok 4. As the next iteration of their ...
Elon Musk’s xAI has open-sourced the base code of its Grok AI model, but without any training code. On GitHub, the company describes it as a 314-billion-parameter Mixture-of-Experts model. In a blog post, ...
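The "Mixture-of-Experts" design mentioned in that snippet routes each token through only a few of the model's expert sub-networks rather than the full parameter set, which is how a model with hundreds of billions of parameters can keep per-token compute manageable. The sketch below is a toy, NumPy-only illustration of top-k expert routing under assumed sizes (hidden dimension, expert count, and k are all made up); it is not Grok-1's actual code.

```python
# Illustrative top-k Mixture-of-Experts routing. Sizes are assumptions for the toy,
# not Grok-1's real configuration.
import numpy as np

rng = np.random.default_rng(0)

D_MODEL = 8       # hidden size (assumed, far smaller than any real model)
N_EXPERTS = 4     # number of expert feed-forward blocks (assumed)
TOP_K = 2         # experts each token is routed to (assumed)

# Router: a linear layer that scores each expert for each token.
W_router = rng.normal(size=(D_MODEL, N_EXPERTS))

# Each "expert" here is reduced to a single weight matrix.
experts = [rng.normal(size=(D_MODEL, D_MODEL)) for _ in range(N_EXPERTS)]


def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)


def moe_layer(tokens):
    """Route each token to its top-k experts and mix their outputs."""
    probs = softmax(tokens @ W_router)              # (n_tokens, N_EXPERTS)
    out = np.zeros_like(tokens)
    for i, tok in enumerate(tokens):
        top = np.argsort(probs[i])[-TOP_K:]         # indices of the k best experts
        weights = probs[i, top] / probs[i, top].sum()
        for w, e in zip(weights, top):
            out[i] += w * (tok @ experts[e])        # weighted sum of expert outputs
    return out


tokens = rng.normal(size=(3, D_MODEL))              # a tiny batch of token vectors
print(moe_layer(tokens).shape)                      # (3, 8): output shape matches input
```

Only TOP_K of the N_EXPERTS matrices are applied per token, which is the efficiency argument behind sparse MoE models in general.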