

Meta just dropped TRIBE v2, an AI that can predict what your brain does when you see a cat video or hear a song.
Instead of strapping you into an fMRI machine for hours, this "digital twin" simulates your neural activity on a laptop. Wild, right?
But wait, there's more:
• Anthropic is quietly throttling Claude during work hours. Are you hitting walls faster now?
• A tiny 8B-parameter model from Ai2 is crushing much larger models at browsing the web. How?
I'm Alex. Welcome to L8R by Innov8.
Let's dive deep 🐰
In today's post:
• Meta's AI Predicts Your Brain Activity
• Claude Gets Slower During Peak Hours
• Ai2's MolmoWeb Beats GPT-4o at Browsing
Meta Just Built a Digital Twin of Your Brain

Meta just dropped TRIBE v2, an open-source AI that predicts how your brain reacts to videos, sounds, and text.
Instead of strapping people into expensive fMRI machines, researchers can now simulate human brain activity right on a computer.
It is basically a digital twin for your mind.
🔍 Key Points:
TRIBE v2 uses a tag-team of Llama 3.2, Wav2Vec-Bert-2.0, and Video-JEPA-2 to map exactly how your brain processes information.
Meta trained this beast on over 1,000 hours of real fMRI brain scans from 720 people.
The AI spits out a 3D brain map with 70,000 data points, boasting a massive 70x resolution boost over older models.
🚨 Why This Matters:
Scientists can now run cheap digital brain experiments instead of booking million-dollar lab machines.
This shows Meta is serious about building AI that perceives the world more like humans do.
It is completely open-source for researchers, so expect universities to start building on it fast.
💡 L8R's Take:
I think this is Meta's smartest stealth play yet: they are mapping the human brain while everyone else just fights over chatbots.
This is absolutely massive and not overhyped, because turning human biology into code is the ultimate AI endgame.
Mark my words: Meta will eventually use this exact brain-mapping tech to make their future AR glasses read your mind.
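If you're wondering what "predicting brain activity" actually looks like under the hood, here's a toy sketch of the general encoding-model recipe: take embeddings from pretrained models, fit a linear readout to fMRI voxel responses, and score it by per-voxel correlation. Every number and array below is made up for illustration. This is not Meta's code or TRIBE's architecture, just the textbook idea behind this family of models.

```python
import numpy as np

# Toy encoding-model sketch (NOT Meta's code): learn a linear readout
# from stimulus embeddings to fMRI voxel activity, then score it with
# per-voxel correlation -- the standard metric for brain encoding models.

rng = np.random.default_rng(0)

n_timepoints = 200  # fMRI samples (hypothetical)
emb_dim = 64        # concatenated text+audio+video embedding size (toy)
n_voxels = 1000     # stand-in for TRIBE v2's ~70,000 data points

# Pretend these embeddings came from the three backbone models.
X = rng.normal(size=(n_timepoints, emb_dim))
# Pretend measured brain responses are a noisy linear function of X.
true_W = rng.normal(size=(emb_dim, n_voxels))
Y = X @ true_W + 0.1 * rng.normal(size=(n_timepoints, n_voxels))

# Ridge regression readout: W = (X^T X + lam*I)^-1 X^T Y
lam = 1.0
W = np.linalg.solve(X.T @ X + lam * np.eye(emb_dim), X.T @ Y)
Y_pred = X @ W

# Correlation between predicted and "measured" activity, per voxel.
Yc = Y - Y.mean(axis=0)
Pc = Y_pred - Y_pred.mean(axis=0)
corr = (Yc * Pc).sum(axis=0) / (
    np.sqrt((Yc**2).sum(axis=0)) * np.sqrt((Pc**2).sum(axis=0))
)
print("mean voxel correlation:", round(float(corr.mean()), 3))
```

The real system swaps the fake embeddings for outputs of Llama 3.2, Wav2Vec-Bert-2.0, and Video-JEPA-2, and predicts tens of thousands of voxels instead of a toy thousand, but the predict-then-correlate loop is the same shape.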
Claude Hits the Brakes on Power Users

Anthropic is pulling the plug on your long Claude chats during busy hours.
Starting this week, Free, Pro, and Max users will hit their limits way faster if they log on during the morning rush.
It is a harsh reality check on the global GPU shortage.
🔍 Key Points:
Anthropic changed its rules on March 23, 2026, making users burn through their 5-hour session limits much faster during peak times.
The new "danger zone" is weekdays from 5:00 a.m. to 11:00 a.m. PT.
About 7% of users, especially paid Pro subscribers, will suddenly hit walls they never saw before.
🚨 Why This Matters:
If you use Claude Code for heavy dev work over your morning coffee, your workflow is about to get wrecked.
It proves that flat-rate AI subscriptions are an illusion because your "unlimited" access actually shrinks when the servers get hot.
You need to schedule your massive background tasks and code generations for the evening or night.
💡 L8R's Take:
Punishing your paying Max users because you cannot afford enough GPUs is a terrible look for Anthropic.
This is not a small tweak; it is a massive red flag that the AI infrastructure boom is hitting a concrete wall right now.
Stop paying for subscriptions and learn to use the API directly if you want real control over your AI tools.
From Our Partners
Every headline serves an opinion. Except ours.
Remember when the news was about what happened, not how to feel about it? 1440's Daily Digest is bringing that back. Every morning, they sift through 100+ sources to deliver a concise, unbiased briefing — no pundits, no paywalls, no politics. Just the facts, all in five minutes. For free.
Tiny Open-Source Model Browses the Web Like a Human

Ai2 dropped MolmoWeb, a tiny 8B model that navigates websites using only screenshots.
It completely ignores messy background code and just clicks what it sees.
🔍 Key Points:
The 8B model scored a massive 78.2% success rate on the WebVoyager test, beating out other open models like Fara-7B.
It relies purely on computer vision to click, type, and scroll, meaning it doesn't need HTML or hidden browser data to work.
Ai2 open-sourced everything: the weights, the massive MolmoWebMix dataset, and the training code, right on Hugging Face.
🚨 Why This Matters:
You can now run a highly capable web-browsing agent locally on your own laptop for free.
This proves we don't need giant, locked-down corporate models to build powerful AI assistants.
Developers should grab these weights immediately to start building custom web scrapers and personal task runners.
💡 L8R's Take:
I absolutely love this release because open-source AI is finally beating the big tech companies at their own game.
Building an agent that only needs screenshots is genius because it makes the AI completely immune to website code changes.
Stop paying monthly fees for premium automation tools and start testing MolmoWeb locally right now.
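The glue code for a screenshot-only agent is mostly loop and parser: screenshot in, model text out, text turned into a click, type, or scroll. Here's a hedged sketch of that parsing step. The action grammar below ("click(x, y)", "type(text)", "scroll(dy)") is hypothetical, not MolmoWeb's actual output format, so swap in the real grammar from the Ai2 release:

```python
import re
from dataclasses import dataclass

# Sketch of the screenshot-agent glue: turn model text into a browser
# action. The grammar here is hypothetical, NOT MolmoWeb's real format.

@dataclass
class Action:
    kind: str    # "click" | "type" | "scroll"
    args: tuple

def parse_action(raw: str) -> Action:
    raw = raw.strip()
    m = re.fullmatch(r"click\((\d+),\s*(\d+)\)", raw)
    if m:
        return Action("click", (int(m.group(1)), int(m.group(2))))
    m = re.fullmatch(r"type\((.*)\)", raw, re.DOTALL)
    if m:
        return Action("type", (m.group(1),))
    m = re.fullmatch(r"scroll\((-?\d+)\)", raw)
    if m:
        return Action("scroll", (int(m.group(1)),))
    raise ValueError(f"unrecognized action: {raw!r}")

# The full loop is then: screenshot -> model -> parse_action -> execute.
print(parse_action("click(120, 340)"))
```

Because the model only ever sees pixels and emits coordinates, the executor side can be any browser driver that can click at (x, y), which is exactly why this approach shrugs off website code changes.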
🚀 Quick L8R Summary
Meta's Brain AI: Meta dropped TRIBE v2, an open-source model that predicts fMRI brain activity from images, sounds, and text. Your brain, simulated on a laptop.
Claude Gets Slower: Anthropic is throttling Claude during peak hours (5-11 AM weekdays) because too many people are hammering their servers
MolmoWeb Crushes Web Tasks: Ai2's MolmoWeb beats GPT-4o and Claude at browsing websites with just 8B parameters, fully open-source.
📩 How was today's L8R 👇?
We read every reply - just reply to this email and let us know how we can improve!
See you in the next L8R, bye…👋

