Data Does Lie, Check Product Bias
There is no better time than now, in developing markets and in the current industrial revolution, to contribute to the discussion of the Fourth Industrial Revolution (4IR) and its alarming PR messaging: half of it well-intentioned misdirection, the other half an overflow of information about which skill set to prioritise and which technology to employ. Job losses, new technology and illiteracy, to name a few features of this incoming era, can indeed create an intimidating barrier to entry, and what adds to the complexity of the situation is that the data lies.
Exploring Data Bias
You should be familiar with the notion of data being the new currency, often compared to oil as a resource that can empower economies. Data, undocumented, raw and undigitised, has always been around; it is rather the scramble for the science and technology of it, and who gets access to it first, that shapes the narrative and creates an opportunity to score points for an industry, an economy or a group of privilege. It's the data scramble; it's the data rush. This, I believe, is what has caused the insurmountable backlash and inaccuracies: the product bias in chatbots and other products, whether towards gender, race or access. The questions that follow are where the data is, what the intentional bias looks like, what the opportunities for solutions to it are, and what stakeholders can do to create inclusive economies.
The Impact of Biased Practices
Machine learning, an application of Artificial Intelligence (AI) concerned with how machines can automatically learn and improve from experience, absorbs the biases of the people who produce its algorithms, and these makers are largely white and male, as audits of the facial recognition products created by IBM, Microsoft and Face++ have shown. The data that informs the models carries the same biases, which breeds much room for prejudice.
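The mechanics are easy to demonstrate. Here is a toy sketch in Python (all data invented, standard library only, no real product implied): train a trivial nearest-neighbour model on a sample where one group outnumbers the other ninety to one, and the error rate for the underrepresented group balloons.

```python
# Toy sketch (stdlib only, invented data): a model trained on a
# demographically skewed sample serves the majority group well and
# the underrepresented group poorly.
import random

random.seed(0)

def sample(group, n):
    # Each group has a different decision boundary: the positive
    # class starts at 0.0 for group A and at 2.0 for group B.
    cut = 0.0 if group == "A" else 2.0
    points = []
    for _ in range(n):
        x = random.gauss(cut, 1.5)
        points.append((x, x > cut))
    return points

# 900 training examples from group A, only 10 from group B.
train = sample("A", 900) + sample("B", 10)

def predict(x):
    # 1-nearest-neighbour on the single feature.
    return min(train, key=lambda t: abs(t[0] - x))[1]

def error_rate(test):
    return sum(predict(x) != y for x, y in test) / len(test)

err_a = error_rate(sample("A", 300))
err_b = error_rate(sample("B", 300))
print(f"group A error rate: {err_a:.2f}")  # low: the model has seen group A
print(f"group B error rate: {err_b:.2f}")  # far higher: group B was barely seen
```

Nothing in the model is malicious; the skew in who is represented in the training data does all the damage, which is exactly the pattern the facial recognition audits surfaced.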
A recent example of a sector that has informed this bias is financial services, mostly in credit, and it is now building the intelligence tools to either enforce or break away from it. In South Africa, usury expert Emerald van Zyl claims that Standard Bank, Africa's oldest bank (along with banks like First National Bank), is currently under fire for billing its black customers at higher interest rates on financing. This is not the first time this has occurred at Standard Bank: in 2012 it was also charged with violating the National Credit Act, and customers were eventually refunded by 2013. If the machine learns from these practices and continues to grant the same product bias, the discriminatory practices are more than likely to continue.
This kind of problem is also present in the health sector. In a New England Journal of Medicine article published on 15 March 2019, researchers drawing on the Framingham Heart Study showed the risk and capability of AI algorithms to demonstrate bias. The research used AI to predict the risk of cardiovascular events in non-white populations, and the results showed bias in both over- and underestimation of risk.
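The over- and underestimation pattern is a calibration failure, and a small sketch makes it concrete. The numbers below are illustrative only (not taken from the study): a single risk score calibrated on a pooled cohort is applied to two groups whose true event rates differ, so it overestimates risk for one and underestimates it for the other.

```python
# Illustrative numbers only (not from the NEJM study): a risk score
# calibrated on a pooled cohort misestimates risk for any group whose
# true event rate differs from the pooled average.
import random

random.seed(1)

model_prediction = 0.10          # risk score learned from the pooled cohort
true_rate = {"group_1": 0.05,    # hypothetical true event rates per group
             "group_2": 0.15}

observed = {}
for group, rate in true_rate.items():
    # Simulate 10,000 patients per group and record the observed event rate.
    events = sum(random.random() < rate for _ in range(10_000))
    observed[group] = events / 10_000
    verdict = "overestimates" if model_prediction > observed[group] else "underestimates"
    print(f"{group}: predicted {model_prediction:.2f}, "
          f"observed {observed[group]:.3f} -> model {verdict} risk")
```

In a clinical setting, both directions of error are harmful: overestimation drives unnecessary treatment, underestimation withholds it.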
People's lives are at stake in the products of 4IR. Beyond the glitter of Sophia the Robot and the new-generation kiosks at companies like McDonald's, there is a community that is not being intentional about inclusivity, and is instead duplicating the structural socio-economic inequalities that implicate others.
Data bias does only one thing: it mirrors what is socially ingrained, which means that it lies, telling a partial truth shaped by those who produce it and not meant for consumption by everyone else.
Dismantling the Structural Bias
The call for inclusive economies goes beyond teaching young, black girls how to code and running strictly women-only data science clubs. Practices like hiring more diverse teams lead to impactful and informed product creation, and are a good contribution to mitigating prejudiced algorithms and encouraging more accurate data in a model. Natural Language Processing (NLP), a sub-division of AI concerned with how computers process human natural languages, is a great example of the necessity of this inclusive call in the sector. Translating open-source data sets in different parts of the world requires an understanding of the language being translated, so that we can not only have Siri understand my instructions in English but also seize the opportunity to preserve and digitise diminishing languages like Khoi; with African languages especially, the impartation of language happens orally. A great example of this opportunity is Ajala Studios, a Nigerian startup that builds natural language and speech processing applications for African languages. This means they can synthesise speech from African languages presented as digitised text, filling a gap in a field that mostly recognises Western accents, voices and names.
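The "Western names" gap is very concrete. A minimal sketch (a hypothetical pipeline step, not Ajala Studios' actual stack): a common ASCII-folding normalisation, still found in many search and speech front ends, silently strips the tone marks that carry meaning in Yoruba.

```python
# A hypothetical but common normalisation step that assumes ASCII:
# decompose accented characters, then drop everything outside ASCII.
import unicodedata

def ascii_fold(text):
    # NFKD splits each accented character into a base letter plus
    # combining marks; the ASCII encode then discards the marks.
    decomposed = unicodedata.normalize("NFKD", text)
    return decomposed.encode("ascii", "ignore").decode("ascii")

name = "Àjàlá"            # Yoruba tone marks distinguish words
print(ascii_fold(name))   # prints "Ajala": the tones are silently lost
```

A pipeline built by people who read and speak the language would treat those marks as signal to preserve, not noise to strip; that is the practical case for diverse teams.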
The responsibility of creating these opportunities is also shared, especially with the public sector. Governments in both developed and developing markets need to invest more in Research and Development (R&D) and in the social concept of open innovation (engaging the public with the data), especially as the impact of this investment is quite telling. Although it is a long-term investment, the return is worthwhile. Researchers from the United Kingdom (UK) and Saudi Arabia looked at 40 Asian countries and how their spend on R&D led to the production of quality research publications across the sciences and social sciences, and further research in the UK shows the positive impact that public investment has on increasing private sector investment and on attracting foreign direct investment. Through its impact on the knowledge economy, R&D investment also presents an opportunity to build more computational intelligence, feed it the missing data, and deliver greater economic impact as measured by Gross Domestic Product (GDP).
The next solution is not only costly but risky. Yet if there is one thing I have learnt about being in the innovation space, whether the product is out on the market or still in the proof-of-concept phase, and no matter how good it looks on paper, it is that it is never too late to take a product off the market if it does not serve its purpose. A great example is Vodacom South Africa's failure, three times, to launch its Kenyan sister network Safaricom's M-Pesa in the South African market. The factors included an onerous regulatory environment, the competitive advantage that the larger, established banks have with their products for low-income consumers, and, some have argued, the mixed messaging at launch, from introducing it as a mobile money wallet to a platform linked to your VISA card. This case study is also an example of the danger of wanting to copy and paste a one-size-fits-all product into an Africa that is not a country.
At the end of the day, it is about investing in the visibility of communities, so as to build better, more impactful and innovative products, and profitability for all ecosystem stakeholders that are part of the operation chain.
The data samples ARE there. And unfortunately (or, seen as an opportunity, fortunately), so is the bias. But all is not lost, not with the desire of visionary stakeholders to operate in a transformative world that uses technology as an enabler of sustainable, good business.