Anthropic's stealth enterprise coup
Diffusion models challenge GPT as next-generation AI
Amazon unveils Alexa+ AI
Make tax time a breeze with FreshBooks
FreshBooks’ intuitive billing, payment, and accounting platform takes the stress out of bookkeeping, keeping you tax-ready year-round. With FreshBooks, you can automate invoices and expenses, process payments seamlessly, and get real-time insights into your business finances. Plus, with features like payroll and accountant access, FreshBooks grows with you, so you can focus on what you do best. FreshBooks. Huh, that was easy.
Get FreshBooks 70% off for 4 months
***
Anthropic's stealth enterprise coup
While consumer attention has focused on the generative AI battles between OpenAI and Google, Anthropic has executed a disciplined enterprise strategy centered on coding — potentially the most valuable enterprise AI use case. The results are becoming increasingly clear: Claude is positioning itself as the LLM that matters most for businesses. The evidence? Anthropic’s Claude 3.7 Sonnet, released just two weeks ago, set new benchmark records for coding performance.
This coding bet is supported by Anthropic’s own Economic Index, which found that 37.2% of queries sent to Claude were in the “computer and mathematical” category, primarily covering software engineering tasks like code modification, debugging and network troubleshooting. Most notable is what Anthropic hasn’t done. Unlike competitors that rush to match each other feature-for-feature, the company hasn’t even bothered to integrate web search functionality into its app — a basic feature most users expect. This calculated omission signals that Anthropic isn’t competing for general consumers but is laser-focused on the enterprise market.
Anthropic appears to be marching to its own beat at a time when competitors are distracted, rushing to cover both enterprise and consumer markets with feature parity. OpenAI’s lead is reinforced by its early consumer recognition and usage, and it’s stuck trying to serve both regular users and businesses with multiple models and feature sets. Google is chasing the same trend, trying to have one of everything. Anthropic, instead of chasing consumer market share, has prioritized enterprise features like GitHub integration, audit logs, customizable permissions and domain-specific security controls. The strategy: first enable developers to build powerful foundations, then expand access to the broader enterprise workforce, up into the corporate suite.
While Anthropic has found its focus, competitors are pursuing different strategies with varying results. Microsoft maintains significant momentum through its GitHub Copilot, which has 1.3 million paid users and has been adopted by more than 77,000 organizations in roughly two years. However, even Microsoft has acknowledged Anthropic’s strength. In October, it allowed GitHub Copilot users to choose Anthropic’s models as an alternative to OpenAI. Google has made its own play by recently making its Code Assist free, but this move seems more defensive than strategic.
In fact, most enterprise companies of scale are explicitly multi-model, in that their AI workflows allow them to use whatever model is best for a given case. Intuit was an early example: the company had bet on OpenAI as the default for its tax return applications, but last year switched to Claude because it was superior in some cases. The pain of switching led Intuit to create an AI orchestration framework that made moving between models much more seamless. They use whatever model is best for the specific case, pulling in models with simple API calls. In some cases, an open-source model like Llama might work well, but in others — for example, in calculations where accuracy is important — Claude from Anthropic is the choice, Intuit’s Ho explained....
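The orchestration pattern described above can be sketched in a few lines. This is a hypothetical illustration, not Intuit’s actual framework: the task names, model names, and stub callers are all assumptions standing in for real vendor API clients.

```python
# Minimal sketch of a multi-model orchestration layer: a routing
# policy keyed by task type, with each model behind a plain callable.
# The stubs below are hypothetical placeholders for real API clients.

from typing import Callable, Dict

def call_claude(prompt: str) -> str:
    # Placeholder for an Anthropic API call (e.g. via the vendor SDK).
    return f"[claude] {prompt}"

def call_llama(prompt: str) -> str:
    # Placeholder for a self-hosted open-source model call.
    return f"[llama] {prompt}"

# Routing policy: accuracy-critical tasks go to Claude; routine text
# tasks go to an open-source model (mirroring the example in the text).
ROUTES: Dict[str, Callable[[str], str]] = {
    "calculation": call_claude,
    "summarize": call_llama,
}

def route(task_type: str, prompt: str) -> str:
    # Unknown task types fall back to the cheaper open-source model.
    handler = ROUTES.get(task_type, call_llama)
    return handler(prompt)
```

Because every model sits behind the same callable signature, swapping vendors (the switching pain Intuit ran into) becomes a one-line change to the routing table rather than a rewrite of the application.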
***
The gold standard of business news
Morning Brew is transforming the way working professionals consume business news.
They skip the jargon and lengthy stories, and instead serve up the news impacting your life and career with a hint of wit and humor. This way, you’ll actually enjoy reading the news—and the information sticks.
Best part? Morning Brew’s newsletter is completely free. Sign up in just 10 seconds and if you realize that you prefer long, dense, and boring business news—you can always go back to it.
***
Diffusion models challenge GPT as next-generation AI
Inception Labs, a startup founded by researchers from Stanford, recently released Mercury, a diffusion-based language model (dLLM) that refines entire phrases at once, rather than predicting words one by one. Unlike traditional large language models (LLMs), which use an autoregressive approach—generating one word at a time, based on the preceding text—diffusion models improve text iteratively, through refinement.
“dLLMs expand the possibility frontier,” Stefano Ermon, a Stanford University computer science professor and co-founder of Inception Labs, tells IBM Think. “Mercury provides unmatched speed and efficiency, and—by leveraging more test-time compute—dLLMs will also set the bar for quality and improve overall customer satisfaction for edge and enterprise applications.” IBM Research Engineer Benjamin Hoover sees the writing on the wall: “It’s just a matter of two or three years before most people start switching to using diffusion models,” he says. “When I saw Inception Labs’ model, I realized, ‘This is going to happen sooner rather than later.’”
Diffusion models don’t play by the same rules as traditional AI. Autoregressive models like GPT build sentences word by word, predicting one token at a time. If a model is generating the phrase “To whom it may concern,” it predicts “To,” then “whom,” then “it,” and so on—one step at a time. Diffusion models flip the script. Instead of piecing together text sequentially, they start with a rough, noisy version of an entire passage and refine it in multiple steps. Think of it like an artist sketching a rough outline before sharpening the details, rather than drawing each element in order. By considering the whole sentence at once, diffusion models can generate responses faster, often with more coherence and accuracy than traditional LLMs. Hoover points to Inception Labs’ Mercury as a prime example of how diffusion models are closing the gap. “That model proved diffusion could hold its own and is actually faster and more efficient than comparable autoregressive models.”
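The contrast above can be made concrete with a toy sketch. This is not a real language model, just an illustration of the two generation schedules: the autoregressive path emits one token per step, while the diffusion-style path starts fully masked and fills in batches of positions over a few parallel refinement rounds. The target phrase and round count are arbitrary.

```python
# Toy contrast of autoregressive vs. diffusion-style generation.
# No model is involved; the "prediction" is just the known target,
# so only the *schedule* of the two approaches is illustrated.

import random

TARGET = "To whom it may concern".split()

def autoregressive(steps_log: list) -> list:
    # Left to right: one token per step, five steps for five tokens.
    out = []
    for tok in TARGET:
        out.append(tok)
        steps_log.append(" ".join(out))
    return out

def diffusion_style(steps_log: list, rounds: int = 3) -> list:
    # Start with an all-masked "noisy" draft, then unmask a batch of
    # positions each round — whole-sequence refinement in few steps.
    out = ["___"] * len(TARGET)
    positions = list(range(len(TARGET)))
    random.shuffle(positions)
    per_round = -(-len(positions) // rounds)  # ceiling division
    for r in range(rounds):
        for i in positions[r * per_round:(r + 1) * per_round]:
            out[i] = TARGET[i]
        steps_log.append(" ".join(out))
    return out
```

Both paths end at the same sentence, but the diffusion-style schedule gets there in three refinement rounds instead of five sequential steps — the source of the parallelism and speed advantage the article describes.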
Diffusion models are often more efficient (5–10x) because they refine entire sequences in parallel rather than generating each word step by step like traditional LLMs, reducing computational overhead. “Our customers and early adopters are developing applications powered by dLLMs in areas including customer support, sales and gaming,” Ermon says. “They’re making their applications more responsive, more intelligent and cheaper.”….
***
Amazon unveils Alexa+
Alexa, the often-mocked smart voice assistant, has been given a new purpose — and a new business model — thanks to generative AI. Looming over the Alexa business was the question of whether the company could ever use the hundreds of millions of Alexa devices out in the world to drive meaningful revenue for Amazon. At first, it was thought people would “speak” their Amazon orders to Alexa — but too few did. Then it was hoped third-party developers would build apps for Alexa, as they do for smartphones, spurring profitable use cases whose revenue Amazon could share. That didn’t happen either — Alexa users had no good way of finding out these apps even existed. Amazon then set about sticking Alexa into anything it could, and presentations became a roll call of dead-on-arrival devices. There are, of course, a whole heap of Alexa-enabled smart home devices, but the company has finally conceded that Alexa is best controlled by seeing as well as hearing.
To emphasize this further, Amazon is launching alexa.com — a browser-based version that is a clear effort to encroach on ChatGPT, Perplexity, Anthropic’s Claude and others. For the first time, Alexa users can forward things like emails or housing agreements to the service so they can be queried in plain English later — “When is football practice?”; “Can I install a solar panel on my house?” Can it compete with those market leaders? Amazon is banking on the Alexa brand being a friendlier gateway to AI — more recognizable and trusted by the vast majority of consumers.
To get there, Amazon believes its integrations and partnerships with third-party services will set it apart, and this stands a far greater chance of succeeding this time around. On Alexa+, which Amazon will make free for Prime members ($19.99 a month otherwise), there are “tens of thousands” of integrations with external services like Uber, Grubhub and OpenTable. The screen means users can be prompted to try integrations they might not otherwise know exist. Recent advancements in natural language processing mean that any reasonable variation of “book me an Uber,” for example, will be understood and acted upon. After years of searching, these integrations look like the first real hint at a significant business model for Alexa….
Cheers! SBalley Team