
Torch: Season 1, Episode 8

How to weaponize AI? Biases & Ethical models explained

Algorithms are unbiased. When I first started in data science, I was convinced that this was true. Now I know that an algorithm is only as good as its data. And can data be biased? In my Torch talk, I will show you how uneven aggregation and interpretation of data can skew data science projects.

 

Download Whitepaper

In this Torch episode, I will show you:

  • Why data quality is key in any data science project.
  • How an algorithm trained on biased historical data will amplify that bias even further (see the sketch after this list).
  • What you can do to prevent your algorithm from producing biased output.
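
To make the second point concrete, here is a minimal sketch, not taken from the episode itself: it builds synthetic "historical" decisions in which group B was held to a higher bar than group A, then trains a model on those labels. All names and numbers (skill, group, the approval thresholds) are illustrative assumptions; the point is that the model faithfully reproduces the gap baked into its training labels, even though skill is distributed identically in both groups.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 10_000

# Hypothetical data: two groups with identical skill distributions.
group = rng.integers(0, 2, size=n)   # 0 = group A, 1 = group B
skill = rng.normal(size=n)           # the attribute that *should* drive the decision

# Biased historical labels: group B needed a visibly higher skill level to be approved.
threshold = np.where(group == 1, 0.5, -0.5)
approved = (skill > threshold).astype(int)

# Train on the biased labels, with group membership available as a feature.
X = np.column_stack([skill, group])
model = LogisticRegression().fit(X, approved)
pred = model.predict(X)

# Compare historical and model approval rates per group: the gap carries over.
for g, name in [(0, "group A"), (1, "group B")]:
    hist_rate = approved[group == g].mean()
    model_rate = pred[group == g].mean()
    print(f"{name}: historical approval {hist_rate:.2f}, model approval {model_rate:.2f}")
```

Note that simply dropping the group column is often not enough, because correlated proxy features can carry the same information into the model.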

Hi everyone, I am Luca Furrer, and I work as a data science and data analytics consultant at Trivadis – Part of Accenture. Being interested in both applied mathematics and software development, I enjoy working at the intersection of data, math and algorithms and finding the right tool for every job. When I am not juggling data, I go jogging, spend time with my children and cook for my family. Did you enjoy my Torch talk? Then let’s get in touch – I would love to pass the torch and discuss the world of data science further!

Takeaway

Algorithms already make many decisions in our lives. Some are less crucial (e.g. which films are recommended to us on Netflix), while others have more serious consequences (e.g. who is granted a loan and who isn’t). How can we make sure that, especially in the latter case, algorithms make unbiased decisions? In my takeaway, you will find a hands-on checklist that you can use as a basis for your own data science project.

Download now
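
One kind of check such a checklist could include, sketched below with made-up data and hypothetical names (approval_rate_report, predictions, group are all illustrative, not from the checklist itself): compare your model's approval rates per group and look at the ratio of the lowest to the highest rate before anything goes into production.

```python
import numpy as np

def approval_rate_report(predictions: np.ndarray, group: np.ndarray) -> dict:
    """Return the approval rate per group and the lowest-to-highest rate ratio."""
    rates = {g: float(predictions[group == g].mean()) for g in np.unique(group)}
    ratio = min(rates.values()) / max(rates.values())
    return {"rates": rates, "disparate_impact_ratio": ratio}

# Made-up example: a model that approves group "B" noticeably less often than group "A".
rng = np.random.default_rng(0)
group = rng.choice(["A", "B"], size=1_000)
predictions = np.where(group == "A",
                       rng.random(1_000) < 0.60,   # roughly 60% approvals for A
                       rng.random(1_000) < 0.35)   # roughly 35% approvals for B
print(approval_rate_report(predictions.astype(int), group))
```

A ratio well below 1.0 (practitioners often treat the informal four-fifths rule, i.e. below 0.8, as a warning sign) does not prove discrimination on its own, but it is a cheap signal that the decision deserves a closer look.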
