“I worried I was weaponising people”: global data expert, Professor Galit Shmueli

Artificial Intelligence simply doesn’t understand “fairness” and teaching advanced data analysis means ensuring people “have a heart as strong as their brain”.

Artificial intelligence, automation, digital transformation – these are some of the hottest trends in business. But at the centre of these developments are issues around big data – how it’s gathered, stored and interpreted.

In this episode of Impact Exchange, Monash Business School’s Professor Richard Hall speaks to one of the world’s foremost experts in data analytics, Distinguished Professor Galit Shmueli from National Tsing Hua University Taiwan, about some of the implications around the ethical use of big data.

As the Director of the Center for Service Innovation & Analytics at NTHU’s College of Technology Management, Professor Shmueli has been working with academics from different disciplines to educate students on business analytics topics such as data mining, forecasting analytics, interactive visualisation and research methods.

Her teaching and research focus is on applying novel statistical methodology and adapting existing methods to modern data structures, with papers such as “To Explain or To Predict?” and “Predictive Analytics in Information Systems Research” attracting global attention.

“I’ve been teaching data mining from early on. A course that I taught at Maryland in 2004 was one of the first courses taught in business schools, and that’s why I had to develop a textbook – there were no textbooks at the time,” she says.

But she says that teaching students how to analyse sets of data is akin to “weaponising them” as they go into a world where privacy and ethical behaviour do not always go hand-in-hand.

“I kept having this nagging and uncomfortable feeling that I’m teaching a very powerful tool kit to MBA students, but I’m not really sure how they will be using it.”

She believes that when you hand over powerful weaponry, you want to make sure the people you are giving it to understand how it can be used for good or bad.

“We need to ensure these people have a heart as strong as their brain,” she says.

“You have to have this combination of really wanting to be empathetic, wanting to put yourself in the other person’s shoes. If I’m the person who’s a line in the data series, how would I react? If we don’t have that empathy, it’s really hard to think about where those algorithms are going to go.”

Professor Richard Hall with Professor Galit Shmueli.

Life journey

Born in Israel, Professor Shmueli has a life journey that flows not only across geographies but also across disciplines. After completing a PhD in statistics, she moved to the US, spending about two years in the Department of Statistics at Carnegie Mellon University.

“I started working on interesting projects with colleagues in the marketing department. That was the first time I ever talked to marketers, because I came from an engineering school,” she says.

“I was also working with people in public health, epidemiology and bioinformatics, and we were working on bio-surveillance, so that was the beginning of my journey with data. I tried to stay away from data during my studies, but that didn’t happen!”

She then accepted a job at the University of Maryland’s Business School and has been in business schools ever since.

After a one-year sabbatical in the Kingdom of Bhutan, in the eastern Himalayas, she decided to go back and live there for five years, continuing her research while also volunteering to develop technology for Bhutan’s education sector.

“Having my foot outside of academia for the first time, I began to see a lot of different things,” she says.

“Bhutan is a very extreme place compared to the United States, compared to Israel, compared to the place I live right now, Taiwan. I think it’s different in every way. You can converse with people in English, so communication was not a problem, but it’s the whole mindset that is very different.”

Professor Shmueli realised you cannot simply ignore the cultural differences when using big data.

Understanding cultural sensitivities

“By living in a place, you start to understand the culture and even after you live somewhere for about five years, you just start getting some of the ideas,” Professor Shmueli says.

She believes the best way to design solutions for a company or a country is to immerse yourself in the place.

“Obviously, this is not happening if you are working in a large American multinational corporation developing algorithms in San Francisco for the whole world.”

The larger problem is the flip side of this: if you understand what’s going on, you can easily manipulate it.

Professor Shmueli says that if you look at what’s happening on the political front, particularly with elections, there are manipulations everywhere.

“There are some countries that understand very well how other countries work, and those countries are therefore easier to manipulate,” she says.

So, it’s very tricky: “I can tell you that this question of using powerful techniques like machine learning is very, very tricky.”

Monitoring bodies or ethics?

“Nobody likes monitoring bodies. If I say the word ethics, you get a knee-jerk reaction from a lot of people. You don’t get it when you talk to philosophers because, for them, ethics has a different meaning. But if you talk to technical people, they think it means regulations,” she says.

“I think of education in terms of values – and I don’t mean a course called ethics. I think it needs to be built much deeper into education. Perhaps students should take some humanities – we need to think about what it means to be human and what it means to live in a society, and that has to be done in a way that’s appealing.”

How to code concepts

Professor Shmueli also discusses the issues with artificial intelligence and with coding or tagging data, where human decisions come into play.

“How can the data understand concepts such as fairness?” she says.

“Fairness means something different to us, something you can’t really code into an algorithm. What’s coded into machine learning is a very specific and narrow thing. So I think there’s a really big gap between how these concepts are perceived in a lot of non-technical fields, including the legal system and the social sciences, and what we do in statistics, where we code them into something very, very specific.”

For example, she asks: “If you say someone was ‘late’, what does ‘late’ mean? Late has to have a certain time duration. And in different cultures, it can mean very different timeframes before you are considered to be late.”

But for AI, coding these concepts means that there needs to be space for judgement. While this can be built into the code, she says “it requires human intervention.”
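
To make this concrete, here is a minimal sketch (an illustration for this article, not code from the interview) of how a fuzzy concept like “late” has to be reduced to a specific numeric threshold before an algorithm can act on it. The norm names and cut-off values are invented for illustration.

```python
# Illustrative sketch: "late" only becomes machine-readable once a
# specific threshold is chosen, and that choice is a human judgement,
# not a property of the data. All values below are hypothetical.

# Hypothetical, culture-dependent cut-offs (minutes past the agreed time)
LATENESS_THRESHOLD_MIN = {
    "strict": 1,     # settings where any delay counts as late
    "moderate": 10,
    "relaxed": 30,   # settings where half an hour is still "on time"
}

def is_late(delay_minutes: float, norm: str = "moderate") -> bool:
    """Return True if a delay counts as 'late' under the chosen norm."""
    return delay_minutes > LATENESS_THRESHOLD_MIN[norm]

# The same behaviour is classified differently under different norms:
print(is_late(15, "strict"))   # True
print(is_late(15, "relaxed"))  # False
```

The algorithm itself is trivial; the real decision, which threshold to use, happens outside the code. That is the human intervention Professor Shmueli is pointing to.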

Professor Shmueli and her multidisciplinary team at NTHU are trying to figure out why and when things go wrong. Some people call it algorithmic bias, she says, but it is human bias that gets coded into the algorithm.
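
That last point can be illustrated with a small, hedged sketch (again, an illustration for this article, not her team’s actual work): if the training labels come from a biased human rule, a perfectly standard classifier will faithfully learn that bias back. All names and numbers here are synthetic.

```python
# Toy example: human bias in the labels becomes "algorithmic bias".
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5_000
skill = rng.normal(size=n)            # genuinely job-relevant feature
group = rng.integers(0, 2, size=n)    # a protected attribute (0 or 1)

# Hypothetical biased labeller: penalises group 1 regardless of skill.
hired = (skill - 0.8 * group + rng.normal(scale=0.5, size=n)) > 0

model = LogisticRegression().fit(np.column_stack([skill, group]), hired)

# The learned coefficient on `group` comes out clearly negative:
# the human bias has been coded straight into the algorithm.
print(dict(zip(["skill", "group"], model.coef_[0].round(2))))
```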

Published on 29 Jan 2020
