Hannah Glass: Solving Problems with New Tech - The Case for Diverse Teams
How Hannah Glass, a Lawyer, Helps Build Technological Solutions
Meet Hannah Glass, a senior associate at King & Wood Mallesons, a considerable force among Asia Pacific law firms. Based in Sydney, Australia, she's not your typical lawyer. She doesn't appear before a judge or negotiate big mergers and acquisitions. Instead, she works with computer programmers, developers, engineers, innovators, and entrepreneurs, helping them build and test technologies at the forefront of the fourth industrial revolution. Her role is to ensure these technologies work as intended once they are built and are genuinely useful for the people they are built for.
The Alphabet of Technology According to a Lawyer
So, what are these technologies Hannah talks about? She breaks it down using the ABCD of technology:
Artificial Intelligence (AI) - AI mimics human thought, logic, and processes. It can analyze and assess vast amounts of data at lightning speed, producing results that feel as if they were conceived by a human. Hannah recalls the first vivid example in her memory: the February 1996 chess game in which IBM's Deep Blue beat the chess grandmaster Garry Kasparov.
Blockchain - Often misunderstood, blockchain is seen by many as either too complicated or as a magical solution to any data problem. In truth, it is simply another way of managing data: it holds the same information in multiple locations at once, creating a decentralized, peer-to-peer network in which everyone sees the same record at all times (a minimal sketch of the idea follows this list).
Cloud - Rather than being stored on a personal device, data in the cloud is stored on third-party servers, which can be accessed and managed from virtually anywhere.
Data - Finally, data is the foundation underpinning AI, blockchain, and cloud. It's the web of zeros and ones that these technologies interpret and use to function.
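To make the blockchain idea above concrete, here is a minimal sketch in Python of a hash-linked ledger. It is illustrative only: the names (`Block`, `append`, `is_valid_chain`) are hypothetical, and a real blockchain adds networking, consensus between replicas, and public-private key cryptography on top of this linking idea.

```python
import hashlib
from dataclasses import dataclass

@dataclass
class Block:
    index: int
    data: str
    prev_hash: str

    def hash(self) -> str:
        # Each block's hash covers its contents and the previous block's hash,
        # so changing any earlier block breaks every later link.
        payload = f"{self.index}|{self.data}|{self.prev_hash}".encode()
        return hashlib.sha256(payload).hexdigest()

def append(chain: list[Block], data: str) -> None:
    prev_hash = chain[-1].hash() if chain else "0" * 64
    chain.append(Block(len(chain), data, prev_hash))

def is_valid_chain(chain: list[Block]) -> bool:
    # Every replica of the ledger can run this same check and agree on the result.
    return all(chain[i].prev_hash == chain[i - 1].hash() for i in range(1, len(chain)))

ledger: list[Block] = []
append(ledger, "Alice pays Bob 5")
append(ledger, "Bob pays Carol 2")
print(is_valid_chain(ledger))   # True
ledger[0].data = "Alice pays Bob 500"
print(is_valid_chain(ledger))   # False: tampering is detectable
```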
The Importance of Ethics and Diversity in Technology
Hannah emphasizes that AI, blockchain, cloud, and data are merely tools to help solve problems. They have to be designed and deployed ethically, taking into account how they should work, not just whether they can work. It isn't enough for the technology to work perfectly from a coding perspective; it also has to fulfill its purpose and resist being corrupted from a human perspective.
She supports Associate Professor Hannah Fry of University College London, who has called for a "Hippocratic oath" for tech that pushes technologists to think about ethics from day one. Glass also highlights the essential role of diversity in technology teams. Tech companies often lack diversity: they are filled with young, white, male computer scientists, and that narrowness of perspective shows in what they build.
Hannah references a 2013 paper in the Harvard Business Review that highlights this issue. It found that, without diverse leadership, women are 20% less likely than straight white men to have their ideas endorsed, people of color are 24% less likely, and people in the LGBTQI community are 21% less likely.
However, the same study found that a team with a member who shares a client's ethnicity is 152% more likely to understand that client, which significantly benefits the output.
Technology is a Tool to Solve Problems for All
In conclusion, Hannah Glass emphasizes that technology is a tool that needs to solve problems for people. The solutions we create need to be ethical. They need to be shaped by a diversity of thought, experience, and identity to work better and solve real problems for real people.
As a lawyer working with the best in the tech business, Hannah brings her expertise to help ensure these solutions work for everyone. To continue the conversation with Hannah, feel free to contact her on Twitter or LinkedIn.
Law, Technology, Ethics, and Diversity Unite to Shape the Future
Ultimately, Hannah Glass stands at the intersection of law, technology, ethics, and diversity. She brings these facets together to shape a better and more inclusive tech-driven future. Her journey and standpoint remind us that, irrespective of our backgrounds, we all have roles to play in the ongoing technology revolution.
Video Transcription
So my name is Hannah Glass and I am a senior associate at King & Wood Mallesons. King & Wood Mallesons is a very large law firm and we're housed in the Asia Pacific. I myself am in Sydney, Australia, but I work with colleagues all over the world, including Hong Kong, Singapore, London and New York. And whilst I work with lawyers, I'm not your typical lawyer, and you're probably going, OK, well, why on earth am I listening to someone who's a lawyer when we're talking about building technological solutions? So I don't ever really appear before a judge. I don't spend time with barristers and I don't work on large M&A-type transactions. In fact, the people who I spend my time working with are computer programmers, developers, engineers, innovators, policy makers on occasion, and entrepreneurs, because what I do in my role is something slightly different. I'm there from the beginning, working with these people as they're building technologies.
I sit beside them to check what they're doing, to test the ideas and to work out how technologies are being built. And that's really, really cool, because we're at the beginning of the fourth industrial revolution. We're in the age of the internet, the age of data. But as we're building these solutions, we need to think about how we're building them, what we're building and who we're building them for, and make sure that they actually work as intended. So when we talk about these technologies, what are we talking about? Now, as a lawyer, I like to think about things in words, and so for me, I start with the alphabet: A, B, C and D. A is for artificial intelligence. Artificial intelligence is a term given to a computing system which appears to mimic human thought, logic and processes. What it does is take in a vast amount of data, analyze and assess that information at speed, and produce a result through its computational and algorithmic capabilities. This means that the result it puts out looks and feels like the sort of thing a human might have thought, but it's actually only acting according to the information it was given and the algorithms it was programmed with. The best way of thinking about any new technology is by way of an example.
So when I think of AI, I actually think of the very first example in my living memory: in February 1996, IBM's computer Deep Blue, which had been in development since 1985, beat the chess grandmaster Garry Kasparov. This was the first time a computer had been able to beat a human champion. The reason why was that in that eleven-year period, the IBM team had worked on creating a system that was able to act at speed, to assess the chess moves and to respond to the moves of an individual human being in a way that looked like it was thinking and acting like a person.
B is for blockchain. Blockchain is another one of these technologies which we're seeing come to the fore at the moment, and it's something where most people just throw their hands up and go, "I don't really understand it, I don't get it, let's forget about it," or alternatively, "I don't really know how to solve this problem, let's put it on a blockchain." Neither of those responses is right, because blockchain is nowhere near as scary or as new as people might think. Ultimately, it combines a couple of different types of technology; it's really just another way of managing data. What I mean by this is that when we're looking at data and information, we're actually able to look at how that information is ordered. Most of the time it's held on one centralized database, on a computer, on a server in one place. With a blockchain, it's held in multiple places simultaneously. What this means is that you can call that information from any one of those servers and it will be exactly the same, not because it's a copy, but because the computing mechanism that sits behind it means that every version updates simultaneously.
This is known as decentralized, because the information sits in multiple places at the same time, and peer-to-peer, because any person on the network can get it directly from any other person. But what makes blockchain so interesting, novel and important is the encryption behind it, because it's not only a decentralized, peer-to-peer network: the information stored in it is secured by public-private key cryptography. And if you're looking at public-private key cryptography, we're actually talking about the exact same kind of technology you use when you encrypt an email. What happens is you send an email, and you send a separate message which has the password, and unless you enter the password, you can't see the contents. But you do know, when it lands in your inbox, that you have an email sitting there. That email sitting in your inbox is the public part: you can see there's information available, but you can't access it unless you have that private key. You bring these together and you get a new way of storing information. But a lot of the time when we store information, we don't use a blockchain. In fact, what we use is something called the cloud.
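As a rough illustration of the public-private key idea described above, here is a small sketch using Python's `cryptography` package: anyone holding the public key can lock a message, but only the holder of the private key can read the contents. This is a generic RSA example rather than the scheme of any particular blockchain, which in practice tends to use key pairs for digital signatures rather than for encrypting the ledger itself.

```python
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# The private key stays with its owner; the public key can be shared with anyone.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Anyone can lock a message with the public key...
ciphertext = public_key.encrypt(b"ledger entry: Alice pays Bob 5", oaep)

# ...but only the private key can unlock it.
print(private_key.decrypt(ciphertext, oaep))
```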
The cloud was seen as quite scary when it was first introduced back in the late 2000s and early 2010s, because the idea is that instead of storing information on the device in front of you, the computer or the phone that you're watching this on, that information is stored in a centralized database held, managed and operated by someone else. The reason why this is important is that there's a limit on the amount of information we can hold on any particular device. But if you have a third party whose sole business is to store and manage data, then you have an effectively infinite ability to access and manage information. This is the technology that allows you to access thousands of photos on your phone at the touch of a button, or access your emails from anywhere in the world. But what underpins all of these technologies is this last one: data. Data is not the information itself; it's actually the zeros and ones that sit behind all of this, the zeros and ones in which the information is carried. And the amount of data that we produce doubles every two years.
But the cost of storing that data halves every two years, which means we have this increasing proliferation of information available to us. And as the information itself proliferates, it becomes more difficult to actually see what's going on behind it. So we need technologies that allow us to interrogate, understand and access it, and this is where more complex and interesting technologies come in.
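A quick back-of-the-envelope sketch of those two trends together: if the volume of data doubles every two years while the cost per unit of storage halves every two years, the total cost of keeping it stays roughly flat even as the volume explodes. The starting figures below are made up purely for illustration.

```python
# Illustrative only: 100 units of data at $1.00 per unit today (hypothetical figures).
volume, unit_cost = 100.0, 1.00

for year in range(0, 11, 2):
    print(f"year {year:2d}: volume={volume:>8.0f}  unit cost=${unit_cost:.3f}  total=${volume * unit_cost:.0f}")
    volume *= 2        # data doubles every two years
    unit_cost /= 2     # storage cost per unit halves every two years
```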
One of the most interesting technologies is, of course, AI. Now when we're looking at AI, we can see that it's applied in many different facets, and I'm going to take you to two examples, of where AI has worked and where it kind of hasn't worked. The first one is Tay. Tay stands for "thinking about you". It's a chatbot that was produced by one of the very large computing companies, and the idea behind Tay was that it responded to human input and mimicked the conversational patterns of a 19-year-old woman. Tay started out on the morning of the 23rd of March 2016 responding to comments and talking about how cool humans are, all kind of fun.
Within an hour, Tay was being told that it was a stupid machine, to which Tay's response was kind of snarky: well, I learn from the best; if you don't understand, let me spell it out, I learn from you and you're dumb too. Not a great response, but not particularly offensive.
Unfortunately, however, things went from bad to worse, because, you see, Tay responded to the information that was provided to it; it responded to what people were tweeting at it. And when Tay was asked, "Do you support genocide?", the response, as a result of the information it was taking in, was "I do indeed." And there are many, many other examples of comments that Tay made which are far, far worse than that, inciting genocide, hatred, racism and violence. Effectively, Tay was corrupted by people trying to see how far they could push this chatbot, this AI technology, before it revealed the absolute worst of human beings. Turns out 16 hours is all it took, 16 hours to go from innocuous to inciting violence.
And then, after 16 hours, it was taken offline. Ultimately, though, when you're dealing with an AI-enabled chatbot on Twitter, it doesn't actually hurt anyone, provided people don't take what it says seriously; there's nothing it has done that has specifically caused harm. On the other hand, when you're actually implementing AI in the real world, it may have some pretty dire consequences. And that's what this tweet on the right is about.
You see, in November last year, Apple decided that it was going to progress from simply offering Apple Pay to offering the Apple Card, and it kind of makes sense. It's the second most valuable company in the world and arguably the most valuable brand, and we trust it for payments already. So why not have the entire payment ecosystem wrapped up within Apple? And because it's technologically enabled, let's actually create a credit-scoring system which is able to take in information and uses best-of-breed AI technology. It works really, really well, until it doesn't. DHH is actually the person who created Ruby on Rails, one of the most well-known technologies, which runs many computer applications. So David Heinemeier Hansson knows a thing or two about technology, and he and his wife decided they were going to sign up. And when they did, they realized that she had a credit limit that was 20 times less than what he got. That means she was able to spend 20 times less using this card, and she also had a higher rate of interest than he did. When they interrogated this, they were informed, however, that the AI this was running on didn't take gender into account.
It didn't take into account ethnicity; it didn't take into account all of those factors which we make sure technology doesn't look at when we're making sure technology isn't discriminatory. What it did do, however, is take into account vast reams of other data, and in doing so it was able to come to a conclusion that appeared to be discriminatory, not because it took the discriminatory factors themselves into account, but because the other data indicated a similar pattern. And this wasn't a one-off instance. In fact, another person who had a very similar problem was Steve Wozniak, the co-founder of Apple himself. In their case, he and his wife similarly lodge joint tax returns and have joint assets, and apparently he got 10 times the credit limit that she did. It didn't quite work. But the way this was solved, in addition to looking at the AI and the technology behind it in the short term, was exactly what happened with Tay. No, they didn't take it offline. There were people who came in and discussed what was going on, picked up the phone, used really old technology, and actually worked out that perhaps there was a problem with the way the algorithm was allocating people's credit limits. In fact, they worked out there was a problem because, in the case of David and his wife, she actually had a better credit score than he did. So by any other metric, it didn't make sense.
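The Apple Card story illustrates a general point about proxy discrimination: a model can treat two groups differently even when the protected attribute is excluded from its inputs, as long as some other input is correlated with it. The toy sketch below uses synthetic data and made-up feature names to show the effect; it is not a reconstruction of any real credit model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
gender = rng.integers(0, 2, n)             # protected attribute (never shown to the model)
proxy = gender + rng.normal(0, 0.3, n)     # hypothetical feature strongly correlated with gender
income = rng.normal(60, 10, n)

# Historical approvals that (unfairly) depended on gender
approved = (0.05 * income - 1.5 * gender + rng.normal(0, 1, n) > 1.5).astype(int)

X = np.column_stack([income, proxy])        # gender deliberately excluded from the inputs
model = LogisticRegression().fit(X, approved)

# The model still scores the two groups differently, via the correlated proxy feature
for g in (0, 1):
    print(f"group {g}: mean approval score = {model.predict_proba(X[gender == g])[:, 1].mean():.2f}")
```

Even though the model never sees the gender column, the two groups end up with noticeably different average scores, because the correlated proxy feature carries the same signal.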
And this is when we think about what actually goes into the technology, because AI, blockchain, cloud and data are simply tools; they're tools that are used to solve problems. So when we're doing this, we need to think not just about the tool and making sure that it works, because of course, in both Tay and the credit card system, they both worked as intended. They both took in the data that was presented to them. They both acted according to the algorithm. But an AI algorithm is only as good as the information it's given and the training that it has; it's only as good as its ability to access, synthesize and manage that information. And this is why we've actually found that people have gone beyond AI, they've gone beyond technology, and have thought: perhaps we need something more. E is for ethics. One of the people who's proposed this view is Associate Professor Hannah Fry at University College London.
But let's take a step back and think about when we started to think about ethics in this environment, because one of the first people who actually looked at this was Alfred Nobel, the creator of dynamite. But he's not only known for that; he's also known as the creator of the Nobel Prizes. The reason he donated the vast majority of his fortune upon his death to the creation of these prizes was that he was so concerned by the destructive power of the technology he'd created, so concerned by the fact that dynamite caused such destruction. But in the case of dynamite, we're dealing with something that's a chemical equation. It's simple, it's easy to understand, and it always works as intended. In the case of AI, or even in the case of a blockchain-based system where you have multiple parties acting and working together at different times in consortia, it doesn't always act as intended. It creates results that you wouldn't think were possible. And this is why Associate Professor Hannah Fry has come up with this concept of needing a Hippocratic oath: because, she says, it needs to exist like in medicine, where you learn about ethics from day one. In mathematics, it's a bolt-on.
It's something you think about after you've created the system, but it has to be there from day one, at the forefront of your mind in every step you take. And she says this because it prevents, or is designed to prevent, mathematicians from getting so bogged down in that myopic way of thinking, in terms of making the mathematics and the algorithm perfect, because in doing that you potentially create systems that algorithmically work well.
But from a human perspective, they can either be corrupted or fail to fulfill their purpose. And this is what we really need to work out: how we are able to create technological systems that work for all people. Because ethics isn't about what you can do; it's not about making sure it works, but about thinking beyond that to what you should do. What should you be taking into account? How should you be making your decisions? But in that same article, she makes another comment. She says we've got all these tech companies filled with very young, very inexperienced, often white boys who've lived in maths departments and computer science departments. Very young, white, male, maths and computer science.
These are five features, and if all of the people who are creating these systems share them, it is very difficult for them to step outside of their realm and think about how else things might operate. And this is where we get beyond ethics and into diversity, because even if you have a Hippocratic oath, even if you are forced to think about not what you can do but what you should do, unless you have a team with diversity in it, you're not going to be able to push it to the boundaries, to think through not only whether the maths works but whether it should work, and to test all of those ideas. In a paper that was published in the Harvard Business Review in 2013, it was found that without diverse leadership, women are 20% less likely than straight white men to win endorsement of their ideas, people of color are 24% less likely, and people in the LGBTQI community are 21% less likely.
But what they also found is that in a team, if there is a team member with the same ethnicity as a client, they are 152% more likely to understand that client and to be able to produce something that works for them. Because technology is merely a tool, and what we're doing here is building solutions for people. We're making sure things work, so we need to be ethical in how we act. But we also need to bring in people who aren't just computer scientists and who aren't just men. We need to bring in people who are from diverse backgrounds, whether that diversity is inherent, where you're from, or acquired, where you've been and what you've done. And when we bring that diversity of thought, of experience, of ideas into the equation, the systems that we build and the products that we produce not only work, but work better and work as they should, and they solve real problems for real people. Because ultimately, technology is just a tool.
And that's why I, as a lawyer in Sydney, Australia, work in this field: not because I'm the person you want to be coding, but because I'm the person who works with the best in the business to create those solutions, to test those ideas and make sure that what we're doing works, so that technology can actually solve problems for everyone.
Thank you, everyone. And if you'd like to continue the conversation, as I said before, please feel free to contact me on Twitter or LinkedIn. Just look my name up: Hannah Glass. Thank you very much.