
Boring Berlin Scientist

Channel address: @boberscience
Categories: Technologies
Language: English
Country: Not set
Subscribers: 698
Description from channel

Useful articles about Data Science, Machine Learning, Data Engineering, and more. A selection of learning material is here: https://github.com/slavadubrov/learning-material

Ratings & Reviews

2.50

2 reviews


5 stars: 0
4 stars: 1
3 stars: 0
2 stars: 0
1 star: 1


Latest messages

2021-09-08 13:00:01 Best Practices on Recommendation Systems by Microsoft

- GitHub
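Toolkits like this usually cover the classic baselines first, such as item-based collaborative filtering. As an illustration of that baseline (a minimal NumPy sketch on a hypothetical toy rating matrix, not code from the Microsoft repo):

```python
import numpy as np

# Hypothetical toy user-item rating matrix (rows: users, cols: items); 0 = unrated.
R = np.array([
    [5.0, 3.0, 0.0, 1.0],
    [4.0, 0.0, 0.0, 1.0],
    [1.0, 1.0, 0.0, 5.0],
    [0.0, 1.0, 5.0, 4.0],
])

# Cosine similarity between item columns.
norms = np.linalg.norm(R, axis=0)
sim = (R.T @ R) / (np.outer(norms, norms) + 1e-9)

def score_items(user_ratings, sim):
    """Score every item as a similarity-weighted sum of the user's ratings."""
    return sim @ user_ratings

scores = score_items(R[0], sim)
unseen = np.where(R[0] == 0)[0]           # items user 0 has not rated
best_unseen = unseen[np.argmax(scores[unseen])]
```

Here the similarity-weighted scores recommend item 2 to user 0, the only item that user has not rated yet.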
608 views · 10:00
2021-09-06 13:00:16 Applications of Graph Neural Networks
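A building block shared by most GNN applications is neighborhood aggregation: each node updates its features from an aggregate of its neighbors' features, followed by a learned transform. A minimal NumPy sketch on a hypothetical toy graph with random weights (no particular library's API):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy graph: 5 nodes, symmetric adjacency with self-loops.
A = np.eye(5)
for i, j in [(0, 1), (1, 2), (2, 3), (3, 4), (0, 4)]:
    A[i, j] = A[j, i] = 1.0

H = rng.normal(size=(5, 4))   # input node features
W = rng.normal(size=(4, 2))   # weight matrix (random stand-in for a learned one)

# One message-passing layer: mean-aggregate neighbors, transform, ReLU.
deg = A.sum(axis=1, keepdims=True)
H_next = np.maximum(0.0, ((A @ H) / deg) @ W)
```

Stacking such layers lets information propagate over multi-hop neighborhoods, which is the mechanism the surveyed applications build on.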
609 views (edited) · 10:00
2021-08-16 13:00:09 Netron, a cross-platform tool for visualizing deep learning models:

- GitHub
201 views · 10:00
2021-08-13 13:00:04 A repo with intuitive explanations, clean code, and visuals for ML:

- GitHub
289 views · 10:00
2021-08-11 13:00:04 A deepfake library for face swapping:

- GitHub
272 views · 10:00
2021-08-09 13:00:03 A walkthrough of a PyTorch implementation of the paper Graph Attention Networks:

- The original post
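The core of a GAT layer is a per-edge attention score, e_ij = LeakyReLU(a^T [W h_i || W h_j]), normalized by a softmax over each node's neighborhood. A minimal single-head NumPy sketch on a hypothetical toy graph with random weights (not the paper's code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: 4 nodes, adjacency with self-loops.
A = np.array([
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [0, 1, 1, 1],
    [0, 0, 1, 1],
], dtype=float)

H = rng.normal(size=(4, 3))   # node features
W = rng.normal(size=(3, 2))   # shared linear transform
a = rng.normal(size=(4,))     # attention vector over [Wh_i || Wh_j]

Z = H @ W                     # transformed features, shape (4, 2)

# e_ij = LeakyReLU(a^T [Wh_i || Wh_j]) for every node pair (i, j).
e = np.zeros((4, 4))
for i in range(4):
    for j in range(4):
        cat = np.concatenate([Z[i], Z[j]])
        e[i, j] = np.maximum(0.2 * (a @ cat), a @ cat)  # LeakyReLU, slope 0.2

# Softmax restricted to each node's neighborhood.
e = np.where(A > 0, e, -np.inf)
alpha = np.exp(e - e.max(axis=1, keepdims=True))
alpha /= alpha.sum(axis=1, keepdims=True)

H_out = alpha @ Z             # attention-weighted aggregation
```

Masking non-edges with -inf before the softmax guarantees that a node only attends to its neighbors, so the attention weights of every row sum to one over the neighborhood alone.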
280 views · 10:00
2021-08-08 13:00:08 Recommendations with bandit models from the TF-Agents library:

- The original blog post
- GitHub
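TF-Agents' own API is not shown here; as a minimal illustration of the underlying idea, an epsilon-greedy multi-armed bandit choosing between three items with hypothetical click-through rates:

```python
import random

random.seed(0)

# Hypothetical click-through rates for three items (unknown to the agent).
true_ctr = [0.05, 0.12, 0.08]

counts = [0, 0, 0]
values = [0.0, 0.0, 0.0]   # running mean reward per arm
epsilon = 0.1

def select_arm():
    if random.random() < epsilon:
        return random.randrange(len(true_ctr))                 # explore
    return max(range(len(values)), key=lambda i: values[i])    # exploit

for _ in range(5000):
    arm = select_arm()
    reward = 1.0 if random.random() < true_ctr[arm] else 0.0
    counts[arm] += 1
    values[arm] += (reward - values[arm]) / counts[arm]        # incremental mean

best = max(range(3), key=lambda i: values[i])
```

The agent balances trying new items against serving the item whose estimated reward is currently highest, which is the core trade-off bandit-based recommenders address.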
284 views · 10:00
2021-08-07 11:23:22 An introductory article on DeepMind's new multipurpose architecture, Perceiver IO:

- The original blog post
- GitHub
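The key trick in the Perceiver family is cross-attention from a small learned latent array to a large input array, so cost grows linearly with input length instead of quadratically. A minimal single-head NumPy sketch with random weights (not DeepMind's code):

```python
import numpy as np

rng = np.random.default_rng(1)

M, N, d = 256, 8, 16                 # many input tokens, few latents, feature dim

inputs = rng.normal(size=(M, d))     # large input array
latents = rng.normal(size=(N, d))    # small latent array (learned in the real model)

Wq = rng.normal(size=(d, d))
Wk = rng.normal(size=(d, d))
Wv = rng.normal(size=(d, d))

Q = latents @ Wq                     # queries come from the latents
K = inputs @ Wk
V = inputs @ Wv

scores = Q @ K.T / np.sqrt(d)        # shape (N, M): linear in M, not M**2
attn = np.exp(scores - scores.max(axis=1, keepdims=True))
attn /= attn.sum(axis=1, keepdims=True)
latents_out = attn @ V               # shape (N, d): inputs compressed into latents
```

Because all subsequent self-attention happens among the N latents, the expensive input only has to be touched in this one cross-attention step.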
254 views · 08:23
2021-06-13 14:13:36
Chinese researchers are fond of writing extensive surveys of particular sub-fields of machine learning, listing the main works and the major breakthrough ideas. So many papers are published every day that it is impossible to read everything, which makes such reviews valuable (when they are well written, which is quite rare).

Recently, a very good paper appeared reviewing the many variants of Transformers, with a focus on language modeling (NLP). It is a must-read for anyone entering the world of NLP who is interested in Transformers. The paper covers the basic principles of self-attention and the details of modern Transformer variants: architectural modifications, pre-training schemes, and applications.

Paper: A Survey of Transformers.
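The self-attention at the heart of every Transformer variant is Attention(Q, K, V) = softmax(QK^T / sqrt(d)) V. A minimal single-head NumPy sketch with random weights (no masking, no multi-head split):

```python
import numpy as np

rng = np.random.default_rng(42)

T, d = 5, 8                        # sequence length, model dimension
X = rng.normal(size=(T, d))        # token embeddings

Wq = rng.normal(size=(d, d))
Wk = rng.normal(size=(d, d))
Wv = rng.normal(size=(d, d))

Q, K, V = X @ Wq, X @ Wk, X @ Wv

# Scaled dot-product attention: softmax(QK^T / sqrt(d)) V.
scores = Q @ K.T / np.sqrt(d)
weights = np.exp(scores - scores.max(axis=1, keepdims=True))
weights /= weights.sum(axis=1, keepdims=True)
out = weights @ V                  # each token is a mixture of all tokens' values
```

The variants the survey catalogues mostly change what happens around this kernel (sparsity patterns, linearized softmax, positional schemes) rather than the kernel itself.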
448 views · 11:13
2021-02-23 14:00:03 data-describe is a Python toolkit for Exploratory Data Analysis (EDA)
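data-describe's own API is not shown here; as a sketch of the kind of EDA such a toolkit automates, a few pandas one-liners over a hypothetical toy dataset:

```python
import numpy as np
import pandas as pd

# Hypothetical toy dataset with a missing value and a categorical column.
df = pd.DataFrame({
    "age": [23, 35, 41, 29, np.nan, 52],
    "income": [40_000, 55_000, 72_000, 48_000, 61_000, 90_000],
    "city": ["Berlin", "Berlin", "Munich", "Hamburg", "Munich", "Berlin"],
})

summary = df.describe(include="all")            # per-column summary statistics
missing = df.isna().sum()                       # missing-value counts per column
top_city = df["city"].value_counts().idxmax()   # most frequent category
corr = df[["age", "income"]].corr().loc["age", "income"]
```

EDA toolkits package exactly these steps (summaries, missingness, distributions, correlations) behind one call, plus plots.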
728 views · 11:00