Artificial Intelligence

Channel address: @artificial_intelligence_in
Categories: Technologies
Language: English
Subscribers: 67.79K
Description from channel

AI will not replace you, but a person using AI will 🚀
I make Artificial Intelligence easy for everyone so you can start with zero effort.
🚀Artificial Intelligence
🚀Machine Learning
🚀Deep Learning
🚀Data Science
🚀Python & R
🚀AR and VR
Dm @Aiindian

Ratings & Reviews

3.50 (2 reviews)

Reviews can be left only by registered users. All reviews are moderated by admins.

5 stars: 0
4 stars: 1
3 stars: 1
2 stars: 0
1 star: 0


The Latest Messages (7)

2023-09-10 17:31:28
Building LLM-Powered Agents

I really like how this figure summarizes the key components needed to build LLM-powered applications.

A lot of developers are already building autonomous systems that achieve complex tasks, and LLMs are at the heart of it all.

Prompting LLMs is one thing, but you still need to figure out what information to pass to the LLM and how to process and use the information it returns.

The key components for building LLM-powered agents include:

- Data Sources - for loading data or metadata

- LLMs - leverage models like GPT-4, Claude, or Llama 2

- Code Executor - for executing code and returning results

- Document Retriever - for embedding and retrieving documents

- Other ML Models - for performing other helpful ML tasks like prediction or forecasting

Not all of the components above are required, but combining one or more can lead to all kinds of useful tools: search engines for knowledge bases, chat LLMs over legal documents, and customer-support automation.
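To make this concrete, here is a minimal sketch in Python of how a few of these components can fit together. Everything here is a stand-in: `call_llm` is a hypothetical placeholder for whichever model API you use (GPT-4, Claude, Llama 2), the retriever matches keywords instead of embeddings, and the executor is a toy, so treat it as a shape rather than an implementation.

```python
from dataclasses import dataclass, field

@dataclass
class DocumentRetriever:
    """Toy retriever: keyword overlap instead of real embeddings."""
    documents: list = field(default_factory=list)

    def retrieve(self, query, k=2):
        words = set(query.lower().split())
        scored = sorted(
            self.documents,
            key=lambda d: len(words & set(d.lower().split())),
            reverse=True,
        )
        return scored[:k]

def call_llm(prompt):
    """Hypothetical placeholder for a real model call (GPT-4, Claude, Llama 2)."""
    return f"[model answer grounded in a {len(prompt)}-character prompt]"

def run_code(snippet):
    """Toy code executor; a real agent would sandbox this properly."""
    try:
        return str(eval(snippet, {"__builtins__": {}}))
    except Exception as exc:
        return f"error: {exc}"

def agent(question, retriever):
    # Retrieve relevant documents, stuff them into the prompt, call the LLM.
    context = "\n".join(retriever.retrieve(question))
    prompt = f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    return call_llm(prompt)

retriever = DocumentRetriever(documents=[
    "Our refund policy allows returns within 30 days.",
    "Support hours are 9am to 5pm on weekdays.",
])
print(agent("What is the refund policy?", retriever))
print(run_code("2 + 2"))  # the Code Executor component in miniature
```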

Learn more here
13.4K views · 14:31
2023-09-01 17:29:01
Dear friends,

I’d like to share a part of the origin story of large language models that isn’t widely known.

A lot of early work in natural language processing (NLP) was funded by U.S. military intelligence agencies that needed machine translation and speech recognition capabilities. Then, as now, such agencies analyzed large volumes of text and recorded speech in various languages. They poured money into research in machine translation and speech recognition over decades, which motivated researchers to give these applications disproportionate attention relative to other uses of NLP.

This explains why many important technical breakthroughs in NLP stem from studying translation — more than you might imagine based on the modest role that translation plays in current applications. For instance, the celebrated transformer paper, “Attention is All You Need” by the Google Brain team, introduced a technique for mapping a sentence in one language to a translation in another. This laid the foundation for large language models (LLMs) like ChatGPT, which map a prompt to a generated response.

Or consider the BLEU score, which is occasionally still used to evaluate LLMs by comparing their outputs to ground-truth examples. It was developed in 2002 to measure how well a machine-generated translation compares to a ground truth, human-created translation.

A key component of LLMs is tokenization, the process of breaking raw input text into sub-word components that become the tokens to be processed. For example, the first part of the previous sentence may be divided into tokens like this:

/A /key /component /of /LL/Ms/ is/ token/ization

The most widely used tokenization algorithm for text today is Byte Pair Encoding (BPE), which gained popularity in NLP after a 2015 paper by Sennrich et al. BPE starts with individual characters as tokens and repeatedly merges tokens that occur together frequently. Eventually, entire words as well as common sub-words become tokens. How did this technique come about? The authors wanted to build a model that could translate words that weren’t represented in the training data. They found that splitting words into sub-words created an input representation that enabled the model, if it had seen “token” and “ization,” to guess the meaning of a word it might not have seen before, such as “tokenization.”
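To see the merge procedure in miniature, here is a hedged sketch of the BPE training loop: start from single characters and repeatedly merge the most frequent adjacent pair. Real tokenizers add many refinements (byte-level handling, special tokens), so this only illustrates the core idea.

```python
from collections import Counter

def bpe_merges(words, num_merges):
    """Learn BPE merge rules from a toy corpus of words."""
    corpus = [tuple(w) for w in words]  # each word starts as single-character tokens
    merges = []
    for _ in range(num_merges):
        # Count how often each adjacent token pair occurs.
        pairs = Counter()
        for word in corpus:
            for a, b in zip(word, word[1:]):
                pairs[(a, b)] += 1
        if not pairs:
            break
        best = pairs.most_common(1)[0][0]
        merges.append(best)
        # Replace every occurrence of the best pair with a merged token.
        new_corpus = []
        for word in corpus:
            out, i = [], 0
            while i < len(word):
                if i + 1 < len(word) and (word[i], word[i + 1]) == best:
                    out.append(word[i] + word[i + 1])
                    i += 2
                else:
                    out.append(word[i])
                    i += 1
            new_corpus.append(tuple(out))
        corpus = new_corpus
    return merges

# Frequent pairs like ('t', 'o') merge first, and the sub-word "token"
# soon emerges as a reusable token shared by all three words.
print(bpe_merges(["token", "tokens", "tokenization"], num_merges=5))
```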

I don’t intend this description of NLP history as advocacy for military-funded research. (I have accepted military funding, too. Some of my early work in deep learning at Stanford University was funded by DARPA, a U.S. defense research agency. This led directly to my starting Google Brain.) War is a horribly ugly business, and I would like there to be much less of it. Still, I find it striking that basic research in one area can lead to broadly beneficial developments in others. In similar ways, research into space travel led to LED lights and solar panels, experiments in particle physics led to magnetic resonance imaging, and studies of bacteria’s defenses against viruses led to the CRISPR gene-editing technology.

So it’s especially exciting to see so much basic research going on in so many different areas of AI. Who knows, a few years hence, what today’s experiments will yield?

Keep learning!
Andrew Ng
14.0K views · edited · 14:29
2023-08-27 17:01:01
To start with Machine Learning:

1. Learn Python
2. Practice using Google Colab

Take these 2 free courses:

Introduction to Python Programming (Udacity)
Machine Learning Crash Course (Google)

If you need a bit more time before diving deeper, finish the following Kaggle tutorials:

Intro to Machine Learning
Intermediate Machine Learning

At this point, you are ready to finish your first project: The Titanic Challenge on Kaggle.
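If it helps to see what a first pass might look like, here is a hedged baseline, assuming you have downloaded train.csv from the competition page (the column names below match Kaggle's Titanic data):

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

df = pd.read_csv("train.csv")  # from the Kaggle Titanic competition page

# Minimal feature prep: a few numeric columns plus sex encoded as 0/1.
df["Sex"] = (df["Sex"] == "female").astype(int)
features = df[["Pclass", "Sex", "SibSp", "Parch"]]
target = df["Survived"]

model = RandomForestClassifier(n_estimators=100, random_state=42)
scores = cross_val_score(model, features, target, cv=5)
print(f"cross-validated accuracy: {scores.mean():.3f}")
```

A simple baseline like this tends to land roughly in the high-.70s; the feature engineering you add afterwards is where the challenge gets interesting.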

If Math is not your strong suit, don't worry. I don't recommend you spend too much time learning Math before writing code. Instead, learn the concepts on-demand: Find what you need when needed.

From here, take the Machine Learning Specialization on Coursera. It's more advanced, and it will stretch you a bit.

The top universities worldwide have published their Machine Learning and Deep Learning classes online. Here are some of them:

MIT 6.S191 Introduction to Deep Learning
DS-GA 1008 Deep Learning
UC Berkeley Full Stack Deep Learning
UC Berkeley CS 182 Deep Learning
Cornell Tech CS 5787 Applied Machine Learning

Many different books will help you. The attached image will give you an idea of my favorite ones.

Finally, keep these three ideas in mind:

1. Start by working on solved problems so you can find help whenever you get stuck.
2. ChatGPT will help you make progress. Use it to summarize complex concepts and generate questions you can answer to practice.
3. Find a community on LinkedIn, Telegram, or 𝕏 and share your work. Ask questions, and help others.

During this time, you'll deal with a lot. Sometimes, you will feel it's impossible to keep up with everything happening, and you'll be right.

Here is the Good News:

Most people understand only a tiny fraction of the world of Machine Learning. You don't need more than that to build a fantastic career in this space.

Focus on finding your path, and Write. More. Code.

That's how you win.
(credits: santiago)
13.8K views · edited · 14:01
With more and more enterprises venturing into Generative AI, the focus is now on engineering systems rather than building models alone.

Models today are just an API call or a fine-tuning run away. For most tasks, a pre-trained model with some fine-tuning does the job.

Only in specialized domains, or in the case of structured data, does one have to go through lengthy cycles of model development from scratch.

If you want to get into the NLP or computer vision field, it is better to build your engineering skills rather than focusing on models alone. (credits: Sriniwasan)
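As one small illustration of the "models are just an API call away" point, here is how little code a pre-trained model needs via Hugging Face's `transformers` pipeline; the task and input here are just examples:

```python
from transformers import pipeline

# Downloads a default pre-trained sentiment model; no training required.
classifier = pipeline("sentiment-analysis")
print(classifier("The onboarding flow was painless and fast."))
# e.g. [{'label': 'POSITIVE', 'score': 0.999...}]
```

The engineering work, per the point above, lives in everything around this call: data plumbing, evaluation, serving, and monitoring.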
13.2K views · edited · 15:23
2023-08-24 18:52:45
Python is the best programming language in the world.

And now, 1.1 billion Microsoft Excel users can use Python in their spreadsheets.

Advanced users can now add Python code directly to a sheet and work with it alongside the usual Excel formulae. Read more
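As a rough illustration of what this looks like (assuming the feature as Microsoft has previewed it): you type =PY( in a cell and then write ordinary Python, with Excel's xl() helper pulling sheet ranges in as pandas objects. This snippet only runs inside Excel's Python cells, not in a normal interpreter:

```python
# Inside a cell, after typing =PY(
df = xl("A1:B100", headers=True)  # read a sheet range as a DataFrame
df.describe()                     # the result spills back into the grid
```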
13.7K views · edited · 15:52
2023-08-23 18:39:06
How AI Helped Chandrayaan-3 Achieve Its Lunar Mission

ISRO’s Chandrayaan-3, the agency’s third lunar mission, made history by touching down on the Moon’s surface.

During the last stage of its landing, the Chandrayaan-3 spacecraft went through a window of "17 minutes of terror," in which it carried out a series of maneuvers crucial for landing: altitude adjustments, thruster firings, and scanning the surface for obstacles, all done with the help of AI. During this period, the Chandrayaan-3 team could monitor progress from the ISRO Telemetry, Tracking, & Command Network in Bengaluru, while AI was at the helm of the Vikram lander. ISRO has confirmed that the lander was autonomously controlled by AI, using Machine Learning to operate its guidance, navigation, control, and other systems.

The lander and rover, as well as the spacecraft as a whole, were designed and developed with the help of AI: the spacecraft’s design was optimized for weight, performance, and safety using AI algorithms.
13.0K views · edited · 15:39
2023-08-21 16:07:01
Stanford University has just opened full access to CS224U.

Covers topics such as contextual word representations, information retrieval, in-context learning, behavioral evaluation of NLU models, NLP methods and metrics, and much more.

It is one of Stanford's immensely popular courses, Natural Language Understanding, taught by Professor Christopher Potts.

Check out the GitHub repo & YouTube playlist.
15.4K views · edited · 13:07
2023-08-09 17:10:34
Generative AI is a multi-billion dollar opportunity!

There will be winners and losers among the businesses directly or indirectly impacted by Gen AI.

But how do you leverage it for business impact? What are the right steps?

Clearly define and communicate company-wide policies for generative AI use, providing access and guidelines to use these tools effectively and safely.

Your business probably falls into one of these categories; identify yours early and act accordingly:

- Uses public models with minimal customization, at a lower cost.
- Integrates existing models with internal systems for more customized results, suitable for scaling AI capabilities.
- Develops a unique foundation model for a specific business case, which requires substantial investment.

Develop financial AI capabilities to accurately calculate the costs and returns of AI initiatives, considering aspects such as multiple model/vendor costs, usage fees, and human oversight costs.
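For instance, a back-of-the-envelope cost model for an API-based deployment can be a few lines; the per-token prices below are made-up placeholders, not any vendor's real rates:

```python
# Hypothetical prices, for illustration only.
PRICE_PER_1K_INPUT = 0.01   # $ per 1,000 input tokens
PRICE_PER_1K_OUTPUT = 0.03  # $ per 1,000 output tokens

def call_cost(input_tokens, output_tokens):
    """Cost of a single model call at the placeholder rates above."""
    return (input_tokens / 1000) * PRICE_PER_1K_INPUT \
         + (output_tokens / 1000) * PRICE_PER_1K_OUTPUT

# 50,000 calls/month at ~800 input and ~300 output tokens each.
monthly = 50_000 * call_cost(800, 300)
print(f"estimated monthly model spend: ${monthly:,.2f}")  # $850.00
```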

Quickly understand and leverage Generative AI for faster code development, streamlined tech-debt management, and automation of routine IT tasks.

Integrate generative AI models within your existing tech architecture and develop a robust data infrastructure and comprehensive policy management.

Create a cross-functional AI platform team, developing a strategic approach to tool and service selection, and upskilling key roles.

Use existing services or open-source models as much as possible to develop your own capabilities, keeping in mind the significant costs of building your own models.

Upgrade enterprise tech architecture to accommodate generative AI models alongside existing AI models, apps, and data sources.

Develop a data architecture that can process both structured and unstructured data.

Establish a centralized, cross-functional generative AI platform team to provide models to product and application teams on demand.

Upskill tech roles, such as software developers, data engineers, MLOps engineers, ethical and security experts, and provide training for the broader non-tech workforce.

Assess the new risks and have ongoing mitigation practices in place to manage models, data, and policies.

For many, it is important to link generative AI models to internal data sources for contextual understanding.

It is important to explore tailored upskilling programs and talent management strategies.

What do you think?
29.1K views · 14:10
2023-08-07 09:03:05
"If someone edits your photo with AI or Photoshop to create a nude photo, then you go to https://www.stopncii.org/ and submit the original photo and the edited photo, then they will remove the edited photo from all the places on the Internet. You don't need to talk directly to anyone for this. Your identity will remain confidential.

If someone has made your picture viral like this, immediately inform the cyber security team, file a case with them and take immediate action "
26.6K views · edited · 06:03