
Artificial Intelligence

Channel address: @artificial_intelligence_in
Categories: Technologies
Language: English
Subscribers: 67.79K
Description from channel

AI will not replace you, but a person using AI will 🚀
I make Artificial Intelligence easy for everyone so you can start with zero effort.
🚀Artificial Intelligence
🚀Machine Learning
🚀Deep Learning
🚀Data Science
🚀Python & R
🚀AR and VR
DM @Aiindian

Ratings & Reviews

3.50 · 2 reviews


5 stars: 0
4 stars: 1
3 stars: 1
2 stars: 0
1 star: 0


The latest messages (8)

2023-08-05 21:23:08
Now you can read your X-rays with AI.

Google Health announced Multimodal medical AI.

This means AI tools can handle multiple types of content, from text and images to sound and video, making them more versatile for medical use.

They've introduced different approaches to enhance these capabilities, such as combining specialised models with LLMs. This development could greatly benefit medical research, education, and patient care.

Learn more here: https://ai.googleblog.com/2023/08/multimodal-medical-ai.html?linkId=8927847&m=1
22.5K views (edited) · 18:23
2023-08-01 19:48:01 Nowadays, I don't code much by myself. I use Aider.

Aider is GPT-powered coding in your terminal.

Check out: https://aider.chat/
GitHub: https://github.com/paul-gauthier/aider
26.7K views (edited) · 16:48
2023-07-25 19:52:50 Most Machine Learning articles on Medium are very low quality and repetitive. Titles are usually clickbait. Most start with a story that is utter nonsense and totally unnecessary. Maybe 5-10% of the content is useful, but the rest is useless. Sorry if I hurt any feelings. Agree?
34.9K views (edited) · 16:52
2023-07-21 18:30:47 New member?

Do you want to learn Artificial Intelligence, Machine Learning, Deep Learning and Data Science?

I am trying to put the world's BEST AI content in this Telegram channel.

So when I start working on my next ML project, I don't need to spend time googling the basics.

All cheat sheets in one place

Mathematics for AI

List of online notebook IDEs for ML projects

Curated list of machine learning datasets

Top AI blogs

Top YouTube channels

AI Jobs ~ a community-driven activity on a no-profit, no-loss basis.

And as a bonus:

All FAQs
Top Python resources

I post daily updates, news, books and free material.

And the cherry on the cake: I am making Artificial Intelligence and machine learning so easy that newbies can start with zero effort.

Reach me @Aiindian
39.5K views · 15:30
2023-07-18 19:52:09
LLaMA 2 just released!

Meta just released LLaMA 2, the new state-of-the-art open-source LLM.

LLaMA 2 is the next iteration of LLaMA and comes with a commercial-friendly license.

LLaMA 2 comes in 3 different model sizes: 7B, 13B, and 70B. The 7B & 13B models use the same architecture as LLaMA 1 and are a 1-to-1 replacement for commercial use!

New features and improvements over v1:

Trained on 2T Tokens
Commercial use allowed
Chat models for dialogue use cases
4096 default context window (can be increased)
7B, 13B & 70B parameter versions
70B model adopted grouped-query attention (GQA)
Chat models can use tools & plugins
LLaMA 2-CHAT as good as OpenAI ChatGPT
Available on Hugging Face


Announcement: https://ai.meta.com/llama/
Paper: https://ai.meta.com/research/publications/llama-2-open-foundation-and-fine-tuned-chat-models/
Models: https://huggingface.co/models?other=llama-2

It's exciting to see Meta going fully open and allowing commercial use. The gap with closed-source models is melting.
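
Since the weights are on Hugging Face, here is a minimal sketch of loading the 7B chat model with the transformers library. Assumptions: pip install transformers accelerate, approved access to the gated meta-llama repo, and enough memory for the 7B weights.

    # Minimal sketch: run Llama 2 7B chat via Hugging Face transformers.
    # Assumes approved access to the gated meta-llama repo (huggingface-cli login)
    # and `pip install transformers accelerate`.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "meta-llama/Llama-2-7b-chat-hf"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    prompt = "Explain attention in one sentence."
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))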
32.7K views (edited) · 16:52
2023-07-05 15:34:01 How can you train Large Language Models?

Large language models (LLMs) are gaining significant popularity due to their versatility in text generation, translation, and question-answering tasks. However, training these models can be resource-intensive and time-consuming. Examples of LLMs include GPT-3 and GPT-4 from OpenAI, LLaMA from Meta, and PaLM 2 from Google.

Several LLM training frameworks have emerged to address this challenge, offering solutions to streamline and enhance the training process. Here are some of the most popular frameworks that help you train and tune LLMs:

Deepspeed: An efficient deep learning optimization library that simplifies distributed training and inference, enabling easy and effective implementation.
Examples: https://www.deepspeed.ai/

Megatron-DeepSpeed: A DeepSpeed version of NVIDIA's Megatron-LM, offering additional support for MoE model training, Curriculum Learning, 3D Parallelism, and other advanced features.
Examples: https://huggingface.co/blog/bloom-megatron-deepspeed

FairScale: A PyTorch extension library designed for high-performance and large-scale training, empowering researchers and practitioners to train models more efficiently.
Example: https://fairscale.readthedocs.io/en/latest/tutorials/oss.html

Megatron-LM: A research-focused framework dedicated to training transformer models at scale, facilitating ongoing exploration in the field.
Examples: https://huggingface.co/blog/megatron-training

Colossal-AI: A platform that aims to make large AI models more accessible, faster, and cost-effective, contributing to democratizing AI advancements.
Examples: https://github.com/hpcaitech/ColossalAI/tree/main/examples

BMTrain: An efficient training framework tailored for big models, enabling smoother and more effective training processes.
Examples: https://github.com/OpenBMB/BMTrain

Mesh TensorFlow: A framework simplifying model parallelism, making it easier to leverage distributed computing resources for training large models.
Examples: https://github.com/tensorflow/mesh

MaxText: A performant and scalable JAX LLM framework designed to simplify the training process while maintaining high performance.
Examples: https://github.com/google/maxtext

Alpa: A system specifically developed for training and serving large-scale neural networks, offering comprehensive support for training requirements.
Examples: https://alpa.ai/opt

GPT-NeoX: An implementation of model parallel autoregressive transformers on GPUs, built on the DeepSpeed library, providing enhanced training capabilities.
Examples: https://blog.eleuther.ai/announcing-20b/

If you're interested in training LLMs, I encourage you to explore these frameworks. They can significantly simplify and optimize the training process, allowing you to achieve better results efficiently.
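
To make the first framework concrete, here is a minimal sketch of a DeepSpeed training loop. Assumptions: pip install deepspeed torch, a toy model and dataset standing in for your own, a ds_config.json that defines the batch size, optimizer and ZeRO stage, and launching the script with the deepspeed CLI.

    # Minimal DeepSpeed training-loop sketch (launch with: deepspeed train.py).
    # The toy model/dataset below are placeholders; ds_config.json is assumed to
    # define train_batch_size (matching the DataLoader), optimizer and ZeRO stage.
    import torch
    import deepspeed
    from torch.utils.data import DataLoader, TensorDataset

    dataset = TensorDataset(torch.randn(256, 512), torch.randint(0, 2, (256,)))
    dataloader = DataLoader(dataset, batch_size=16)
    model = torch.nn.Linear(512, 2)

    model_engine, optimizer, _, _ = deepspeed.initialize(
        model=model,
        model_parameters=model.parameters(),
        config="ds_config.json",
    )

    for inputs, labels in dataloader:
        inputs = inputs.to(model_engine.device)
        labels = labels.to(model_engine.device)
        loss = torch.nn.functional.cross_entropy(model_engine(inputs), labels)
        model_engine.backward(loss)  # DeepSpeed scales/accumulates the loss
        model_engine.step()          # optimizer step + zero_grad in one call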
37.4K views (edited) · 12:34
2023-06-26 16:53:01 The best way to learn about AI is to build real projects with it.

When you build, you learn what’s valuable and what’s not.

Use GitHub, Kaggle, Papers with Code, and YouTube tutorials, and build a strong foundation around them. Develop a solid skillset, and always remember there is no alternative to practice and hard work.
45.2K views (edited) · 13:53
2023-06-24 15:20:26 Voice AI is the hot area in AI this week!

Meta released Voicebox and now Google has released AudioPaLM.

The AudioPaLM model can listen to and generate both voice and text. Combining listening and generation in a single model opens up more opportunities for developers and builders.

Check out the demo:
https://google-research.github.io/seanet/audiopalm/examples/
45.1K views (edited) · 12:20
2023-06-23 13:22:01 Building AI applications will be one of the most crucial skills for the next 20 years.

If I were starting today, I'd learn these:

• Python
• OpenAI API
• LangChain
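
As a tiny illustration of the second item, here is a minimal sketch of one chat completion with the OpenAI Python package (the pre-1.0, v0.x style). Assumptions: pip install openai, an OPENAI_API_KEY environment variable, and a model name that is only an example.

    # Minimal sketch: one chat completion with the openai package (v0.x style).
    import os
    import openai

    openai.api_key = os.environ["OPENAI_API_KEY"]

    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",  # example model name
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Give me one project idea for learning LangChain."},
        ],
    )
    print(response["choices"][0]["message"]["content"])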
44.0K views · 10:22
2023-06-04 19:48:55 Google has created a Generative AI learning path with 9 FREE courses!

Topics covered:
- Intro to LLMs
- Attention Mechanism
- Image Generation/Captioning
- Intro to Responsible AI

From the fundamentals of LLMs to creating & deploying generative AI solutions!

Introduction to Generative AI:

An introductory level micro-learning course aimed at explaining:

- What Generative AI is
- How it is used
- How it differs from traditional ML

Check this out
https://www.cloudskillsboost.google/course_templates/536

Introduction to Large Language Models:

The course explores:

- Fundamentals of LLMs
- Their use cases
- Prompt engineering on LLMs

Check this out
https://www.cloudskillsboost.google/course_templates/539

Introduction to Responsible AI:

The course explains what responsible AI is, why it's important, and how Google implements responsible AI in their products.

Check this out
https://www.cloudskillsboost.google/course_templates/554

Introduction to Image Generation:

This course introduces diffusion models, a family of ML models that recently showed promise in the image generation space.

Check this out
https://www.cloudskillsboost.google/course_templates/541

Encoder-Decoder Architecture:

This course gives you a synopsis of the encoder-decoder architecture.

It's a powerful and prevalent machine learning architecture for sequence-to-sequence tasks.

Check this out
https://www.cloudskillsboost.google/course_templates/543

Attention Mechanism:

The course teaches you how attention works (a minimal NumPy sketch appears at the end of this post) & how it revolutionised:

- machine translation
- text summarisation
- question answering

Check this out
https://www.cloudskillsboost.google/course_templates/537

Transformer Models and BERT Model:

This course introduces you to some of the most famous and effective transformer architectures!

Check this out
https://www.cloudskillsboost.google/course_templates/538

Create Image Captioning Models:

This course teaches you how to create an image captioning model by using deep learning.

Check this out
https://www.cloudskillsboost.google/course_templates/542

Introduction to Generative AI Studio:

This course introduces Generative AI Studio, a product on Vertex AI.

It teaches you to prototype and customize generative AI models so you can use their capabilities in your applications.

Check this out
https://www.cloudskillsboost.google/course_templates/552

Bonus:
https://cloud.google.com/blog/topics/training-certifications/new-google-cloud-generative-ai-training-resources
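
As promised in the Attention Mechanism entry above, here is a minimal NumPy sketch of scaled dot-product attention, the core operation these courses build on. Illustration only: real transformer layers add masking, multiple heads and learned projections, and the shapes below are arbitrary examples.

    # Scaled dot-product attention: attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V
    import numpy as np

    def scaled_dot_product_attention(Q, K, V):
        d_k = Q.shape[-1]
        scores = Q @ K.T / np.sqrt(d_k)                           # query-key similarity
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights = weights / weights.sum(axis=-1, keepdims=True)   # softmax over keys
        return weights @ V                                        # weighted sum of values

    # Toy example: 3 query positions, 4 key/value positions, dimension 8.
    rng = np.random.default_rng(0)
    Q, K, V = rng.normal(size=(3, 8)), rng.normal(size=(4, 8)), rng.normal(size=(4, 8))
    print(scaled_dot_product_attention(Q, K, V).shape)  # (3, 8)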
63.2K views (edited) · 16:48