
Why more isn't always better




About this listen

In this episode we tackle AI training data.

More specifically, we examine the assumption that more data will always improve your models. From overfitting and noisy datasets to issues of scale, labeling, and ethics, you'll see that with AI models, quality is almost always better than quantity. We also tackle problems where a lack of good data leads to real issues, and the ethics of using widely available internet data without consent.
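The quality-over-quantity point can be made concrete with a toy sketch (all numbers here are invented for illustration): estimating a true value by averaging labels. A large dataset where half the labels carry a systematic error can produce a worse estimate than a small, carefully labeled one.

```python
# Toy illustration: more data is not always better.
# We estimate a true value (10.0) by averaging labeled samples.
true_value = 10.0

# Four carefully labeled samples.
clean_small = [9.8, 10.1, 10.2, 9.9]

# One hundred samples, but half are systematically mislabeled by +3.0.
noisy_large = [true_value + 3.0] * 50 + [true_value] * 50

def estimate(samples):
    """Average the labels to estimate the underlying value."""
    return sum(samples) / len(samples)

err_clean = abs(estimate(clean_small) - true_value)  # 0.0
err_noisy = abs(estimate(noisy_large) - true_value)  # 1.5

# The small clean set beats the 25x-larger noisy one.
assert err_clean < err_noisy
```

The systematic labeling error never averages out no matter how many samples are added, which is one reason scale alone can't rescue a noisy dataset.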


