We're excited to introduce The AI Scientist, the first comprehensive system for fully automatic scientific discovery, enabling Foundation Models such as Large Language Models (LLMs) to perform research independently.
https://github.com/SakanaAI/AI-Scientist
Blog: https://sakana.ai/ai-scientist/
Paper: https://arxiv.org/abs/2408.06292
GitHub
GitHub - SakanaAI/AI-Scientist: The AI Scientist: Towards Fully Automated Open-Ended Scientific Discovery 🧑🔬
The AI Scientist: Towards Fully Automated Open-Ended Scientific Discovery 🧑‍🔬 - SakanaAI/AI-Scientist
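To make the "LLMs doing research on their own" idea concrete, here is a toy sketch of an idea → experiment plan → write-up loop driven by an LLM. This is not the SakanaAI pipeline (which also runs code, iterates on experiments, and reviews its own papers); the OpenAI client and the model id are assumptions for illustration only.

```python
# Toy sketch of an LLM-driven research loop: idea -> plan -> draft.
# NOT the AI Scientist's actual code; client and model id are assumptions.
from openai import OpenAI

client = OpenAI()

def ask(prompt: str) -> str:
    """Single LLM call; the real system iterates, executes code, and reviews itself."""
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model id
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

topic = "regularization in small transformer language models"
idea = ask(f"Propose one concrete, testable research idea about {topic}.")
plan = ask(f"Write a short experiment plan (data, baseline, metric) for this idea:\n{idea}")
draft = ask(f"Write a one-paragraph abstract for a paper based on this plan:\n{plan}")
print(draft)
```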
An interesting take on the hype vs. no-hype question.
*"I don't think that "AI" models [a] (by which I mean: large language models) are over-hyped.
Yes, it's true that any new technology will attract the grifters. And it is definitely true that many companies like to say they're "Using AI" in the same way they previously said they were powered by "The Blockchain". (As we've seen again, and again, and again, and again.) It's also the case we may be in a bubble. The internet was a bubble that burst in 2000, but the Internet applications we now have are what was previously the stuff of literal science fiction.
But the reason I think that the recent advances we've made aren't just hype is that, over the past year, I have spent at least a few hours every week interacting with various large language models, and have been consistently impressed by their ability to solve increasingly difficult tasks I give them. And as a result of this, I would say I'm at least 50% faster at writing code for both my research projects and my side projects as a result of these models.
Most of the people online I find who talk about LLM utility are either wildly optimistic, and claim all jobs will be automated within three years, or wildly pessimistic, and say they have contributed nothing and never will.
So in this post, I just want to try and ground the conversation. I'm not going to make any arguments about what the future holds. I just want to provide a list of 50 conversations that I (a programmer and research scientist studying machine learning) have had with different large language models to meaningfully improve my ability to perform research and help me work on random coding side projects."*
https://nicholas.carlini.com/writing/2024/how-i-use-ai.html
Carlini
How I Use "AI"
I don't think that AI models (by which I mean: large language models) are over-hyped. In this post I will list 50 ways I've used them.
Something is happening...
https://techcrunch.com/2024/08/05/openai-co-founder-leaves-for-anthropic/
TechCrunch
OpenAI co-founder Schulman leaves for Anthropic, Brockman takes extended leave
One of OpenAI's co-founders and a chief architect of ChatGPT has left the company for rival AI startup Anthropic.
And to continue the theme of books and ever-growing page counts, here's another gem
Algebra, Topology, Differential Calculus, and Optimization Theory For Computer Science and Machine Learning
https://www.cis.upenn.edu/~jean/math-deep.pdf
2,196 pages long.
Have a nice weekend!
It's been a while since we last wrote about new LLMs, and today Qwen2 from Alibaba Cloud is out
https://qwenlm.github.io/blog/qwen2/
Five models: Qwen2-0.5B, Qwen2-1.5B, Qwen2-7B, Qwen2-57B-A14B, and Qwen2-72B, each in base and instruction-tuned variants. Context up to 128k. Looks great on the benchmarks and beats Llama 3.
In addition to English and Chinese, it supports 27 more languages.
Apache 2.0 license for all models except the largest one, which keeps the previous Qianwen License.
Multimodality with video and audio is promised for the future.
Qwen
Hello Qwen2
After months of efforts, we are pleased to announce the evolution from Qwen1.5 to Qwen2. This time, we bring to you: Pretrained and instruction-tuned models of 5 sizes, including Qwen2-0.5B, Qwen2-1.5B…
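For reference, a minimal sketch of running one of the instruction-tuned Qwen2 checkpoints with Hugging Face transformers. The model id Qwen/Qwen2-7B-Instruct and the generation settings are assumptions; check the official model card before relying on them.

```python
# Minimal sketch: load an instruction-tuned Qwen2 checkpoint and generate a reply.
# Model id and generation parameters are assumptions, not from the announcement.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen2-7B-Instruct"  # assumed Hugging Face repo name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype="auto")

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize what a mixture-of-experts model is."},
]
# Build the prompt with the model's own chat template.
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Generate a short completion; sampling settings are illustrative.
output = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```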
And here comes Grok! A 314B-parameter MoE model. Apache 2.0 license. https://x.ai/blog/grok-os https://github.com/xai-org/grok
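Since Grok is a mixture-of-experts model, here is a toy illustration of what top-k expert routing looks like. This is a generic PyTorch sketch of the MoE idea, not Grok's architecture or released code; all sizes are made up.

```python
# Toy top-2 mixture-of-experts (MoE) layer -- a generic illustration of expert
# routing, NOT Grok's architecture or code. All dimensions are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyMoE(nn.Module):
    def __init__(self, d_model=64, d_ff=256, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, n_experts)  # per-token expert scores
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        ])

    def forward(self, x):  # x: (n_tokens, d_model)
        gate = F.softmax(self.router(x), dim=-1)      # routing probabilities
        weights, idx = gate.topk(self.top_k, dim=-1)  # keep top-k experts per token
        weights = weights / weights.sum(dim=-1, keepdim=True)
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e                 # tokens sent to expert e in slot k
                if mask.any():
                    out[mask] += weights[mask, k:k + 1] * expert(x[mask])
        return out

print(ToyMoE()(torch.randn(10, 64)).shape)  # torch.Size([10, 64])
```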
In case you'd like to watch a lecture on consciousness today
https://royalsociety.org/science-events-and-lectures/2024/03/faraday-prize-lecture/
royalsociety.org
Consciousness in humans and in other things | Royal Society
Join us for the Michael Faraday Prize Lecture given by 2023 winner Professor Anil Seth.