
Artificial Intelligence

Logo of telegram channel artificial_intelligence_in — Artificial Intelligence
Channel address: @artificial_intelligence_in
Categories: Technologies
Language: English
Subscribers: 70.03K
Description from channel

AI will not replace you, but a person using AI will 🚀
I make Artificial Intelligence easy for everyone so you can start with zero effort.
🚀Artificial Intelligence
🚀Machine Learning
🚀Deep Learning
🚀Data Science
🚀Python & R
🚀AR and VR
Dm @Aiindian

Ratings & Reviews

3.50 · 2 reviews

Reviews can be left only by registered users. All reviews are moderated by admins.

5 stars: 0
4 stars: 1
3 stars: 1
2 stars: 0
1 star: 0


The latest messages

2023-10-21 16:46:01 An hour spent improving your Software Engineering skills is more productive than an hour spent improving your Machine Learning skills.

I’m not saying ML is unimportant. Obviously it is.

I’m just saying I've seen more Data Scientists held back by their ability to deploy working software than by their ability to make proper modeling decisions.

Any Data Scientist can build a model in a Jupyter Notebook.

Fewer can take that model and deploy it in a production setting.

Even fewer can do this in a way that's fault-tolerant, scales well, and allows for easy iteration.

A strong understanding of SWE principles lets you build and deploy your models more efficiently and autonomously, which will better differentiate you from other Data Scientists.

Here's a shortlist of software engineering concepts I've found relevant to DS and ML:
1) REST and Micro-service architecture
2) Version control & CI/CD
3) Dependency injection
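
As a sketch of the first concept on that list, here is what a minimal REST prediction endpoint might look like. Flask, scikit-learn, the `/predict` route, and the JSON payload shape are all my illustrative assumptions, not something from the original post:

```python
# Hypothetical sketch: a tiny REST endpoint that serves model predictions.
# The route name and payload shape are illustrative assumptions.
from flask import Flask, jsonify, request
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

app = Flask(__name__)

# Train a toy model at startup; in production you would load a serialized model.
X, y = load_iris(return_X_y=True)
model = DecisionTreeClassifier(random_state=0).fit(X, y)

@app.route("/predict", methods=["POST"])
def predict():
    # Expects a JSON body like {"features": [5.1, 3.5, 1.4, 0.2]}
    features = request.get_json()["features"]
    label = int(model.predict([features])[0])
    return jsonify({"prediction": label})
```

Run it with `flask run` and POST a feature vector to `/predict`; the same endpoint shape extends naturally to a micro-service behind a load balancer.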
14.2K views · 13:46
2023-10-16 16:13:01 If I had to start learning AI / ML all over again, this is what I would do differently:

I'd learn PyTorch over TensorFlow

TensorFlow is written in C++ and wrapped in Python. PyTorch is Python through-and-through. So it feels more natural. Also, the dynamic computational graph has an easier learning curve. So stick to PyTorch for getting started. You can learn TF later if you need to.
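
A tiny illustration of that dynamic ("define-by-run") graph, assuming PyTorch is installed; the toy function is mine, not from the post:

```python
import torch

# Dynamic graph: ordinary Python control flow picks the computation each pass.
x = torch.tensor(2.0, requires_grad=True)
y = x * x if x.item() > 0 else -x  # branch decided at run time
y.backward()                       # graph built on the fly, then differentiated
print(x.grad)                      # dy/dx = 2x = 4.0
```

Because the graph is rebuilt every forward pass, you can debug it with plain `print` statements and a regular Python debugger.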

I'd learn: tools first, math and model theory second

This contradicts almost every course I've ever come across. But I learn by DOING. Abstract concepts don't make sense to me unless I can apply them to real-world problems. So if I had to do it all over again, I'd start by learning how to use the most basic model possible (decision trees) to predict labels in the iris dataset. Then I'd practice on other datasets until I got comfortable, and then advance to logistic regression. Then I'd learn the math and theory behind logistic regression. Rinse and repeat.
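
That first step might look like the sketch below, assuming scikit-learn; the train/test split and random seed are my illustrative choices:

```python
# A decision tree on the iris dataset: the "most basic model possible"
# starting point described above. Split size and seed are illustrative.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

tree = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
accuracy = tree.score(X_test, y_test)  # fraction of correct test predictions
print(accuracy)
```

Once this feels comfortable, swapping `DecisionTreeClassifier` for `LogisticRegression` is a one-line change, which is exactly the "rinse and repeat" loop.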


I'd focus on deployment much sooner
Success in ML is all about thinking through how your models can add value and then working backward. Traditional teaching leads to ML practitioners building models without a clear goal, which fails 95% of the time. By getting to deployment sooner, you train your mind to think closer to the end goal.

I'd only use remote dev environments

Having a remote dev environment is SO much easier since you don't have to worry about hardware or local package environments. Examples: Google Colab, Amazon SageMaker, PyCharm Remote Development.

I'd start using MLflow immediately

Building a model that will live in a Jupyter Notebook forever is very different from building a model that needs to be deployed and maintained in production. By using a framework like MLflow early on, you'll instill good habits from the start.

I'd wait longer to learn advanced NLP

It's good to start with basic NLP concepts like topic modeling and word2vec, but I'd avoid generative NLP until I had a solid foundation in the basics and at least 1-2 years of experience in the field.
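
For the "basic NLP concepts" mentioned above, here is one way topic modeling might look with scikit-learn's LDA; the four-document corpus and the two-topic count are toy assumptions of mine:

```python
# Toy topic modeling sketch: LDA over a tiny illustrative corpus.
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

docs = [
    "the cat sat on the mat",
    "dogs and cats are pets",
    "stocks rose as markets rallied",
    "investors traded shares on the market",
]
counts = CountVectorizer(stop_words="english").fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)
doc_topics = lda.transform(counts)  # shape: (4 docs, 2 topics), rows sum to 1
```

Each row of `doc_topics` is a probability distribution over topics for that document, which is the core idea before moving on to anything generative.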
14.2K views · edited 13:13
2023-10-13 16:38:01
Introduction to Modern Statistics

If you are studying Machine Learning, it's worth every minute you spend learning Statistics.

This online book looks like a great place to start.

FREE PDF is also available.

This is an absolute gem!

openintro-ims.netlify.app
13.2K views · edited 13:38
2023-10-12 19:43:35
Large Language Models (in 2023)

An excellent summary of the research progress and developments in LLMs.

Hyung Won Chung (OpenAI; previously Google, MIT alumnus) made this content publicly available. It's a great way to catch up on some important themes like scaling and optimizing LLMs.

Watch his talk here; the slides are shared here.
13.3K views · edited 16:43
2023-10-09 16:43:01 Top Platforms for Building Data Science Portfolio

Build an irresistible portfolio that hooks recruiters with these free platforms.

Landing a job as a data scientist begins with a portfolio that showcases all your projects. To help you get started, here is a list of top data science platforms. Remember: the stronger your portfolio, the better your chances of landing your dream job.

1. GitHub
2. Kaggle
3. LinkedIn
4. Medium
5. MachineHack
6. Weights & Biases
7. DagsHub
8. DataSciencePortfol
9. HuggingFace
15.3K views · edited 13:43
2023-10-08 06:50:32 New developers: whenever you work on something interesting, write it down in a document which you keep updating. This will be very helpful when you need to create a resume or have to talk about your achievements in an interview. (Or for college essays.)

I can guarantee you that if you don't do this, you will forget half the interesting things you've done; and for a majority of us, our brains are experts in convincing us that we haven't really done anything interesting.
13.7K views · 03:50
2023-10-03 00:18:22 In the last 24 months, the way I write code has changed big time.

First, I've been using Copilot for almost two years now. I don't think about it anymore. It's already part of how I write code. It works.

How much faster am I because of Copilot?

That's a tricky question, but I estimate between 20% and 40% faster.

Copilot, however, is the tip of the iceberg.

I don't remember the last time I went to Stack Overflow. ChatGPT took that place in my workflow. Zero negativity, and it gives me precise answers about my codebase.

Here is everything I do with ChatGPT:

• It gives me ideas about which unit tests to write
• It helps me find dumb bugs I missed
• It explains code I don't understand
• It helps me write code in unfamiliar languages
• It writes about 80% of the documentation I need
• It gives me ideas on how to improve my code
• It mentions corner cases I should cover
• It answers my "How do I do this?" questions

I don't have access to the multi-modal version yet. I've seen some impressive demos where ChatGPT generates an application from a diagram. I can't wait to try it.

I've reviewed dozens of tools over the past year. Most build on top of Large Language Models to help developers. Here are some of the most notable areas these tools tackle:

• Smart assistants that integrate with notebooks and your IDE. Some of these go beyond what Copilot does. There's even a fork of a popular IDE that promises an AI-first experience.
• Assistant integrated with database servers. You can use them to build Machine Learning models without leaving your database.
• Assistants that help with code reviews. They offer suggestions, look for security vulnerabilities, and help with documentation.

None of these was possible two years ago.

I wrote my first line of code in 2012. I don't remember a period where I improved the way I write code that much.

These tools aren't perfect. They automate many tasks but are far from replacing me as a professional. Artificial Intelligence is not better than me, but it gives me superpowers.

Remember that AI will not replace you. A person using AI will.
14.3K views · edited 21:18
2023-09-26 17:08:01 Everything you should know about LLMs!
Jeremy Howard has created a code-first introduction to Large Language Models, covering the fascinating landscape and all the important concepts in 90 minutes. Worth watching!


12.9K views · edited 14:08
2023-09-21 07:59:09
DALL·E 3 is here and it looks amazing!

Some thoughts:

From the announcement: "DALL·E 3 delivers significant improvements over DALL·E 2 when generating text within an image and in human details like hands. DALL·E 3 creates engaging images by default—no hacks or prompt engineering required."

Very curious to see to what extent we will rely on prompt engineering for this version. That has been one of the pain points for image generation systems, making them hard to use or get good results with.

OpenAI claims we won't need as much prompt engineering and that DALL·E 3 adheres better to prompt instructions and details.

Also, incredibly excited about the availability of DALL·E 3 in ChatGPT Plus and how this could improve the entire process of prompting and discovering interesting prompts to generate compelling and unique images. openai.com/dall-e-3
14.0K views · 04:59
2023-09-19 10:31:56 Join our WhatsApp Channel: https://whatsapp.com/channel/0029Va8iIT7KbYMOIWdNVu2Q
12.8K views · edited 07:31