Data Science at Home

Technology, machine learning and algorithms

Categories: Technology, Podcasting
Rank #197 in Technology category
Updated 8 days ago
iTunes Ratings

41 Ratings
5 stars: 24, 4 stars: 10, 3 stars: 2, 2 stars: 2, 1 star: 3

Yes
By stevensforjesus - Nov 04 2017
This is so good

Very well done
By Charliea472 - Aug 02 2016
Very clear explanations of data science concepts.

Rank #1: Episode 72: training neural networks faster without GPU


Training neural networks faster usually means using more powerful GPUs. In this episode I explain an interesting method from a group of researchers at Google Brain, who train neural networks faster by squeezing more out of the hardware and making the training pipeline denser, a technique they call data echoing.
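
To make the idea concrete, here is a minimal sketch of data echoing in plain Python, assuming a generator-style input pipeline; the wrapper is illustrative and is not the authors' implementation. When reading and preprocessing are the bottleneck, each batch they produce is reused a few times so the accelerator spends less time waiting for data.

    # Minimal, illustrative sketch of data echoing (not the paper's code):
    # reuse each batch coming out of a slow input pipeline `echo_factor` times
    # so the accelerator is not left idle waiting for fresh data.
    def echo_batches(batch_iterator, echo_factor=2):
        for batch in batch_iterator:
            for _ in range(echo_factor):
                yield batch

    # Usage with a hypothetical pipeline and training step:
    # for x, y in echo_batches(slow_input_pipeline(), echo_factor=2):
    #     model.train_step(x, y)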

Enjoy the show!

 
References

Faster Neural Network Training with Data Echoing, https://arxiv.org/abs/1907.05550

Aug 06 2019
22 mins

Rank #2: Episode 53: Estimating uncertainty with neural networks


Have you ever wanted to get an estimate of the uncertainty of your neural network? Clearly Bayesian modelling provides a solid framework to estimate uncertainty by design. However, there are many realistic cases in which Bayesian sampling is not really an option and ensemble models can play a role.

In this episode I describe a simple yet effective way to estimate uncertainty, without changing your neural network’s architecture or your machine learning pipeline at all.

The post with mathematical background and sample source code is published here.
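
As a rough illustration of the ensemble idea (the post above has the actual method and the maths), one can train a few models independently and use the spread of their predictions as an uncertainty estimate; the `models` list and the scikit-learn-style `predict` method below are assumptions.

    # Illustrative ensemble uncertainty: the disagreement between independently
    # trained models is used as a proxy for predictive uncertainty.
    import numpy as np

    def ensemble_predict(models, x):
        preds = np.stack([m.predict(x) for m in models])  # (n_models, n_samples)
        return preds.mean(axis=0), preds.std(axis=0)      # prediction, uncertainty

    # mean, sigma = ensemble_predict(models, x_test)
    # A large sigma means the ensemble disagrees: treat that prediction with caution.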

Jan 23 2019
15 mins

Rank #3: Episode 43: Applied Text Analysis with Python (interview with Rebecca Bilbro)


Today’s episode is about text analysis with Python. Python is the de facto standard in machine learning: a large community and a generous choice of libraries, sometimes at the price of slower execution, but overall a decent language for typical data science tasks.

I am with Rebecca Bilbro, co-author, together with Benjamin Bengfort and Tony Ojeda, of Applied Text Analysis with Python.

We speak about the evolution of applied text analysis, tools and pipelines, and chatbots.
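
To give a flavour of what such pipelines look like, here is a generic scikit-learn example of mine, not code from the book: TF-IDF features feeding a linear classifier.

    # A minimal text-classification pipeline: TF-IDF features + linear model.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import Pipeline

    text_clf = Pipeline([
        ("tfidf", TfidfVectorizer()),                 # tokenize and weight terms
        ("clf", LogisticRegression(max_iter=1000)),   # classify documents
    ])

    # text_clf.fit(train_texts, train_labels)
    # predictions = text_clf.predict(test_texts)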

Aug 14 2018
36 mins

Rank #4: Episode 71: How to scale AI in your organisation


Scaling technology and scaling business processes are not the same thing. Since the beginning of enterprise technology, scaling software has been a difficult task to get right inside large organisations. When it comes to Artificial Intelligence and Machine Learning, it becomes vastly more complicated.

In this episode I propose a framework, in five pillars, for the business side of artificial intelligence.

Jul 30 2019
13 mins

Rank #5: Episode 70: Validate neural networks without data with Dr. Charles Martin


In this episode, I am with Dr. Charles Martin from Calculation Consulting, a machine learning and data science consulting company based in San Francisco. We speak about the nuts and bolts of deep neural networks and some impressive findings about the way they work.

The questions that Charles answers in the show are essentially two (see the sketch below):

    Why is regularisation in deep learning seemingly quite different from regularisation in other areas of ML?
    How can we dominate DNNs in a theoretically principled way?
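
As a data-free illustration of the kind of analysis this points to, one can inspect the singular value spectrum of a trained layer's weight matrix. This sketch is an assumption about the flavour of the approach, not Dr. Martin's exact method.

    # Illustrative only: a layer's weight spectrum can be examined without any
    # validation data, hinting at how the network is (implicitly) regularised.
    import numpy as np

    def weight_spectrum(W):
        """Singular values of a weight matrix, in decreasing order."""
        return np.linalg.svd(W, compute_uv=False)

    # Random stand-in for a trained weight matrix:
    W = np.random.randn(512, 256)
    print(weight_spectrum(W)[:5])  # the five largest singular values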

 

References 
Jul 23 2019
44 mins

Rank #6: Episode 63: Financial time series and machine learning


In this episode I speak to Alexandr Honchar, data scientist and owner of the blog https://medium.com/@alexrachnog. Alexandr has written very interesting posts about time series analysis for financial data, and his blog is in my personal list of best tutorial blogs. We discuss financial time series and machine learning, what makes predicting the price of stocks a very challenging task, and why machine learning might not be enough. As usual, I ask Alexandr how he sees machine learning in the next 10 years. His answer, in my opinion quite futuristic, makes perfect sense.
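
To illustrate why the task is so hard (a toy example of mine, not Alexandr's material): on price series that behave like a random walk, even a naive "tomorrow equals today" forecast is a surprisingly strong baseline, and any model has to beat it on a strictly time-ordered split.

    # Toy illustration: naive one-step baseline on a random-walk price series.
    import numpy as np

    prices = np.cumsum(np.random.randn(1000)) + 100.0   # random-walk stand-in

    split = int(len(prices) * 0.8)                       # never shuffle time series
    train, test = prices[:split], prices[split:]

    naive_forecast = test[:-1]                           # yesterday's price
    mae = np.mean(np.abs(test[1:] - naive_forecast))
    print(f"Naive one-step MAE: {mae:.3f}")              # a hard baseline to beat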

You can contact Alexandr on

Enjoy the show!

Jun 04 2019
21 mins

Rank #7: Episode 38: Collective intelligence (Part 1)


This is the first part of the amazing episode with Johannes Castner, CEO and founder of CollectiWise. Johannes is finishing his PhD in Sustainable Development at Columbia University in New York City, and he is building a platform for collective intelligence. Today we talk about artificial general intelligence and wisdom.

All references and show notes will be published after the next episode. Enjoy and stay tuned!

Jul 12 2018
30 mins

Rank #8: Episode 44: The predictive power of metadata


In this episode I don't talk about data. In fact, I talk about metadata.

While many machine learning models rely on certain amounts of data, e.g. text, images, audio and video, it has been proved how powerful the signal carried by metadata is, that is, all the data that is invisible to the end user. Behind a tweet of 140 characters there are more than 140 fields of data that draw a much more detailed profile of the sender and the content she is producing, without ever considering the tweet itself.
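
As a toy illustration of how much signal metadata carries, here is a sketch that builds a feature vector from tweet metadata only, never touching the text; the field names follow the public Twitter API, but the function itself is hypothetical.

    # Build features from tweet *metadata* only (no text), in the spirit of the
    # paper referenced below. Purely illustrative.
    def metadata_features(tweet: dict) -> list:
        user = tweet.get("user", {})
        return [
            user.get("followers_count", 0),
            user.get("friends_count", 0),
            user.get("statuses_count", 0),
            user.get("favourites_count", 0),
            int(user.get("geo_enabled", False)),
            int("iphone" in tweet.get("source", "").lower()),
        ]

    # A classifier trained on such vectors can often identify the author
    # without ever looking at the 140 characters of text.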

References

You are your Metadata: Identification and Obfuscation of Social Media Users using Metadata Information, https://www.ucl.ac.uk/~ucfamus/papers/icwsm18.pdf

Aug 21 2018
21 mins

Rank #9: Episode 67: Classic Computer Science Problems in Python


Today I am with David Kopec, author of Classic Computer Science Problems in Python, published by Manning Publications.

His book deepens your knowledge of problem-solving techniques from the realm of computer science by challenging you with interesting and realistic scenarios, exercises and, of course, algorithms. There are examples covering the major topics any data scientist should be familiar with, for example search, clustering, graphs, and much more.
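
To give a flavour of the "classic problem" style, here is an example of mine (not taken from the book): the kind of compact, well-understood algorithm the book revolves around.

    # Classic binary search over a sorted list.
    from typing import List, Optional

    def binary_search(items: List[int], target: int) -> Optional[int]:
        low, high = 0, len(items) - 1
        while low <= high:
            mid = (low + high) // 2
            if items[mid] == target:
                return mid
            if items[mid] < target:
                low = mid + 1
            else:
                high = mid - 1
        return None

    print(binary_search([1, 3, 5, 7, 11], 7))  # -> 3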

Get the book from https://www.manning.com/books/classic-computer-science-problems-in-python and use coupon code poddatascienceathome19 to get a 40% discount.

 

References

Twitter https://twitter.com/davekopec

GitHub https://github.com/davecom

classicproblems.com

Jul 02 2019
28 mins

Rank #10: Episode 36: The dangers of machine learning and medicine


Humans seem to have reached a crossroads, where they are asked to choose between functionality and privacy: they cannot have both. Not both at all. No data, no service. That’s what companies building personal finance services say. The same applies to marketing companies, social media companies, search engine companies, and healthcare institutions.

In this episode I speak about the reasons to aggregate data for precision medicine, the consequences of such strategies, and how researchers and organizations can provide services to individuals while respecting their privacy.

Jul 03 2018
22 mins

Rank #11: Episode 48: Coffee, Machine Learning and Blockchain


In this episode, which I advise you to consume at night, in a quiet place, I speak about private machine learning and blockchain while I sip a cup of coffee in my home office. There are several reasons why I believe we should start thinking about private machine learning... It doesn't really matter which approach becomes successful and gets adopted, as long as it makes private machine learning possible. If people own their data, they should also own the by-product of such data.

Decentralized machine learning makes this scenario possible.
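
One concrete mechanism behind decentralized machine learning is federated averaging: clients train locally on data they keep, and only model parameters are aggregated. The sketch below is a generic illustration, not necessarily the approach discussed in the episode.

    # Illustrative federated averaging: combine models trained on data that
    # never leaves each client, weighting by how much data each client holds.
    import numpy as np

    def federated_average(client_weights, client_sizes):
        """Weighted average of model parameters trained locally by each client."""
        total = sum(client_sizes)
        return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

    # Two hypothetical clients holding 100 and 300 samples:
    w_a, w_b = np.array([0.2, -1.0]), np.array([0.6, -0.2])
    global_w = federated_average([w_a, w_b], [100, 300])
    print(global_w)  # -> [ 0.5 -0.4]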

Oct 21 2018
28 mins

Rank #12: Episode 61: The 4 best use cases of entropy in machine learning


It all starts with physics. The entropy of an isolated system never decreases… Everyone learned this at school, at some point in their life, in a physics class. What does this have to do with machine learning? To find out, listen to the show.
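
For reference, this is the quantity in question, Shannon entropy, computed for a couple of simple distributions (a generic illustration; see the linked post for the four use cases).

    # Shannon entropy of a discrete distribution, the quantity behind e.g.
    # decision-tree splitting criteria and cross-entropy losses.
    import numpy as np

    def entropy(p):
        p = np.asarray(p, dtype=float)
        p = p[p > 0]                       # 0 * log(0) is treated as 0
        return -np.sum(p * np.log2(p))

    print(entropy([0.5, 0.5]))   # 1.0 bit: maximally uncertain coin
    print(entropy([0.9, 0.1]))   # ~0.469 bits: much more predictable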

 

References

Entropy in machine learning https://amethix.com/entropy-in-machine-learning/

May 21 2019
21 mins

Rank #13: Episode 52: why do machine learning models fail? [RB]


The success of a machine learning model depends on several factors and events. True generalization to data that the model has never seen before is more a chimera than a reality. But under specific conditions a well-trained machine learning model can generalize well and reach a testing accuracy similar to the one achieved during training.

In this episode I explain when and why machine learning models fail to carry their performance from the training to the testing dataset.
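
A minimal way to see the failure mode in practice (a generic scikit-learn illustration, not from the episode): let an unconstrained model memorise the training set and compare train and test accuracy.

    # Overfitting in miniature: a perfect training score, a weaker test score.
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=500, n_features=20, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    model = DecisionTreeClassifier(max_depth=None).fit(X_tr, y_tr)  # unconstrained
    print("train accuracy:", model.score(X_tr, y_tr))  # ~1.0 (memorised)
    print("test accuracy:", model.score(X_te, y_te))   # noticeably lower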

Jan 17 2019
15 mins

Rank #14: Episode 39: What is L1-norm and L2-norm?


In this episode I explain the differences between L1 and L2 regularization, which you can find in the objective function being minimized by basically any machine learning model.
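
For reference, the two penalty terms in code (lambda is the regularisation strength; the function names are mine).

    # L1 and L2 regularisation penalties added to a model's loss.
    import numpy as np

    def l1_penalty(w, lam):
        return lam * np.sum(np.abs(w))   # L1: encourages sparse weights

    def l2_penalty(w, lam):
        return lam * np.sum(w ** 2)      # L2: shrinks weights smoothly

    w = np.array([0.0, -2.0, 0.5])
    print(l1_penalty(w, 0.1), l2_penalty(w, 0.1))  # 0.25, 0.425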

Jul 19 2018
21 mins

Rank #15: Episode 45: why do machine learning models fail?


The success of a machine learning model depends on several factors and events. True generalization to data that the model has never seen before is more a chimera than a reality. But under specific conditions a well-trained machine learning model can generalize well and reach a testing accuracy similar to the one achieved during training.

In this episode I explain when and why machine learning models fail to carry their performance from the training to the testing dataset.

Aug 28 2018
16 mins
