MiniMax open-sources self-evolving agent model
MiniMax has open-sourced MiniMax M2.7, with the model weights publicly available on Hugging Face. The company calls it its most capable open-source model yet and says the model itself participated in its own development cycle.
MarkTechPost ↗
Liquid AI unveils compact vision-language model
Liquid AI released LFM2.5-VL-450M, an updated 450M-parameter vision-language model with bounding box prediction, better instruction following, multilingual support, and function calling. The model is designed for fast inference on edge hardware such as NVIDIA Jetson Orin and mini-PCs.
MarkTechPost ↗
TriAttention boosts LLM throughput with KV compression
Researchers from MIT, NVIDIA, and Zhejiang University introduced TriAttention, a KV cache compression method for long-context LLM reasoning. The approach is reported to match full-attention accuracy while delivering about 2.5 times higher throughput.
MarkTechPost ↗
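The item does not describe TriAttention's actual mechanism, but the general idea behind KV cache compression can be sketched generically: evict cached key/value entries that receive little attention mass, shrinking the cache (and thus memory bandwidth) while approximating the full-attention output. The function names and the top-k eviction policy below are illustrative assumptions, not TriAttention's method.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attend(q, K, V):
    # Single-query scaled dot-product attention over a KV cache.
    scores = softmax(q @ K.T / np.sqrt(q.shape[-1]))
    return scores @ V, scores

def compress_kv(K, V, scores, keep):
    # Illustrative eviction policy (an assumption, not TriAttention's):
    # drop cache entries with the lowest attention mass, keeping `keep` tokens.
    idx = np.argsort(scores)[-keep:]
    idx.sort()  # preserve original token order
    return K[idx], V[idx]

rng = np.random.default_rng(0)
d, n = 64, 512
K = rng.standard_normal((n, d))
V = rng.standard_normal((n, d))
q = rng.standard_normal(d)

full_out, scores = attend(q, K, V)
K_c, V_c = compress_kv(K, V, scores, keep=n // 4)  # 4x smaller cache
comp_out, _ = attend(q, K_c, V_c)

# How close the compressed cache's output is to full attention.
err = np.linalg.norm(full_out - comp_out) / np.linalg.norm(full_out)
print(f"relative error with 4x-compressed cache: {err:.3f}")
```

Real methods decide what to keep per layer and per head during decoding; this sketch only shows why dropping low-attention entries can shrink the cache without changing the output much.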
Altman responds to profile after home attack
Sam Altman posted a response after an apparent attack on his home and a New Yorker profile that questioned his trustworthiness. The piece highlights ongoing scrutiny around the OpenAI CEO rather than any new policy or product change.
TechCrunch ↗
Secure Local-First Agent Runtime Tutorial
This tutorial explains how to build a fully local OpenClaw agent runtime with strict gateway binding, authenticated model access, and controlled tool execution. It also shows how to add a structured custom skill for the agent to discover and use safely.
MarkTechPost ↗