FAQ

Cecil Baird, ELONTech Advisor

1)  Do disinformation and digital media constitute a challenge to democracy?

Having different opinions is healthy for public debate and democracy. However, targeting individuals with alternative facts, fake news, post-truths and lies to influence how they think, decide and vote is very dangerous for our democracy.

Disinformation and digital media have already had an impact on our democracy, from the UK leaving the European Union to the presidential elections in the USA. You could say that we are living in an information war, best noticed when we look up from our screens and take a bird’s-eye view of the political activity in the West.

2)  Can we hold algorithmic decision-making systems accountable?

In 2019, holding an algorithm accountable is holding nobody accountable. Algorithms are created by humans, who have written the code and the values it embodies, and they should be held accountable. We should also expect transparency about the values programmed and their outcomes.

3) Is it possible or desirable to build moral principles into AI systems?

Good morals will enable humans and robots to work cooperatively in groups. We want to teach and train AI the difference between good and bad values, the principles of right and wrong behaviour. However, before giving AI systems a sense of morality, engineers and experts in this field would first have to define moral values, which is hurdle number one, and then teach them in a way that artificial intelligence systems can process: through objective metrics, which is not how morality as we know it works.

4) What do you consider the next 5 years’ challenge/trend for the research and innovation in your field of expertise?

The internet is highlighting a new series of pressing problems we need to solve. Blockchain, as a distributed ledger technology, can provide the means to solve some of them.

I expect to see more startups built on decentralised data storage rather than on the centralised, corporate-controlled data service model of companies such as Facebook and Google, which is susceptible to corruption and enables mass surveillance.

I also expect self-sovereign identities for everyday consumers, creating seamless, secure online services for commerce as well as for government services; hopefully refugees and the homeless will benefit from these advancements too. There will be more applications that use blockchain for cryptographically secure proof of rights, ownership and time-stamping. Progress with stablecoins and Security Token Offerings will also be an exciting space to watch.
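As a rough sketch of what cryptographically secure time-stamping can look like at its simplest (the file name below is hypothetical, and a real proof-of-existence service would anchor the digest to a public blockchain rather than just print it):

```python
import hashlib
import json
import time

def fingerprint(path: str) -> str:
    """Return the SHA-256 digest of a file; only the digest needs to be published."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def timestamp_record(path: str) -> dict:
    """Bundle the digest with the current time.

    In a real system this record (or its hash) would be written to a public
    blockchain, so anyone can later verify that the document existed,
    unchanged, at that moment, without ever revealing the document itself.
    """
    return {"sha256": fingerprint(path), "timestamp": int(time.time())}

if __name__ == "__main__":
    # "deed_of_ownership.pdf" is a made-up example file.
    print(json.dumps(timestamp_record("deed_of_ownership.pdf"), indent=2))
```

The security comes from the hash: if even one byte of the document later changes, the recorded digest no longer matches, so the proof of ownership or existence cannot be forged after the fact.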

5) You have worked and lived in both the EU and the US. In your opinion, what is the difference between European and American societies in accepting innovation and embracing the so-called “disruption” in all dimensions and fields of social life? Any blockchain-for-good project that you discovered lately?

Some context: I am a European (Franco-British), raised in London and educated within the French school system. I recently moved to California to expand our company and raise our family.

Not only is everything bigger in the States, from the food portions to the cars they drive, but this seems to translate into a “Think Big” attitude, aided by the “Can Do” American spirit and much less of a social stigma around failure. Think Big, Can Do, less fear of failure and more investment money available for startups create a very welcoming stage for innovation. I also find there are some terrific customer service processes, which is crucial for growing businesses.

Europe has a long history, with traditions and architecture that need to be protected, observed and respected; it is a nostalgic place. Disruption challenges traditions, and there is something ‘sad’ and upsetting about that. In general, Europeans are more careful in their decision-making compared to the States. And unfortunately, although Europe is a large economy, its many languages cause fragmentation, making it less attractive to VCs. But Europe manufactures world-class goods, products and produce, and is immensely creative. And although regulation is stricter than in America, I do think that Europe puts citizens’ rights and the environment first, ahead of business and profit, and that is a good thing.

 

Guenther Dobrauz, ELONTech Advisor

 1) Do disinformation and digital media constitute a challenge to democracy?

Democracy has at all times been a challenge and under attack, but it is our duty to rise to that very challenge and to defend it at all costs. At the heart of democracy is transparency, which creates accountability and ultimately trust. Digital media has the potential to create both: access to information and transparency, or indeed disinformation and a clouding of the truth. As the early digital generations, whether brought up in an analogue world and having learned to live with the new digital reality, or true digital natives, we all have to accept that this world has developed faster than our understanding of it. As such, it is of paramount importance to continue to shape education around it, as well as to align established rules and regulations with the altered reality.

2) Can we hold algorithmic decision-making systems accountable?

I not only think we can; I believe we must. But to do so we must start with those in charge of creating and operating them, which means that we must first of all ensure transparency and access to the code, and secondly get the creators to accept legal and ethical responsibility. It is probably not too far-fetched to think that we are back to yet another modern Prometheus. We must be mindful of both the positive and the negative potential of this new fire.

3) Is it possible or desirable to build moral principles into AI systems?

Again, my answer would be a strong yes on both counts. Now, that is easier said than done, as it immediately triggers the question of which moral principles. But then again, I believe that generally accepted moral principles can be agreed upon and implemented to provide systems with at least a basic moral compass; in any case, the human/machine interface will be a critical debate.

4) What do you consider the next 5 years’ challenge/trend for the research and innovation in your field of expertise?

My focus is on regulation, LegalTech and the transformation of the legal world, innovation and exponential technologies. In regulation, the key challenge will be to align the existing frameworks, which were created over the course of decades if not centuries, against a completely different background and based on different assumptions, with the altered reality (and needs) of today and with today’s (and tomorrow’s) technological possibilities. We must also critically rethink the function of institutions and how to transpose them and their valuable core safely into the future. The legal world itself will face unprecedented change and partial disruption. Again, we should see the opportunity rather than the threat, and at the same time be mindful of how we preserve the role of lawyers as stewards of the law and how we use the potential of technology to ensure and extend access to justice. Finally, the dynamics of innovation are themselves rapidly changing in an exponential age, which is a challenge since we are by and large local and linear thinkers. We must find ways to rise to this challenge.

5) You launched the Disruption Disciples a few months ago. What is the main objective of this initiative?

The two big passions of my life have always been the law and innovation. When it comes to the latter, I soon realised that to move forward one needs traction and friction, which is best achieved when different materials come together, and that true innovation usually (a) originates from outside established industries and (b) happens when people with different backgrounds and different expertise come together. It is our privilege to live in the day and age of exponential technologies, and true exponential value will be created at their intersection as different fields converge at an increasingly rapid pace. At the same time, there are social, economic and ecological challenges that can only be met by mastering complexity and collaboration. Technological progress enables us to do this on a hitherto unimagined level. But we are confronted with many barriers that limit us in achieving our potential: lack of knowledge, creativity and collaboration. To reverse this, we need to reorient and inspire the synapses of the system.

We are here to do exactly this: tear down the barriers and accelerate change. Disruption Disciples is a global movement that promotes the critical exchange of ideas and knowledge and fosters collaboration. We strive to ignite a new dynamic to advance civilisation in large steps. Our circle includes those who give priority to change over optimisation. Every building block that guarantees this disruption of the status quo is valuable, irrespective of background or origin. We create an intellectually barrier-free environment and a counter-movement to the echo chambers of our times. Disruption is a state of mind.

So in essence, my goal with Disruption Disciples was to accelerate this process by bringing people together and bridging that one degree of separation: first locally, by creating local chapters, and then globally, by connecting these chapters and their members. I was truly amazed by the uptake, which has seen more than 50 chapters around the world come to life in just a few weeks and people connecting and exchanging across the globe. In essence, Disruption Disciples is a 21st-century version of the salons of the Enlightenment.
 

 

Lio Dricot, ELONTech Advisor (UCL, Université catholique de Louvain, Louvain-la-Neuve)

1) Do disinformation and digital media constitute a challenge to democracy?

To answer the question, we would have to define democracy first. We have an intuitive notion but, when we study it closely, we realise that the word “democracy” in itself is a huge challenge. And even with a proper definition of democracy, we would still need to define “disinformation”.

What we witness is that digital media are challenging the notion of “truth” that was previously held by a few centralised media outlets. What was printed in the newspaper or seen on television was the “truth”; nobody questioned that. But this phenomenon is quite recent: before the French Revolution, “truth” was mostly a monopoly of religions. Religions were a by-product of writing as a technology. Religions managed to write things down, and people didn’t question them. Religions were essential to maintaining an aristocratic society. If it was written, it was the truth. Thanks to the printing press, writing became more common, and the media became the truth. Attending religious ceremonies was replaced by the news on television. Our sacred duty switched from “obey God’s commandments” to “be informed, listen to the news”. Aristocracy was replaced by something we call “democracy”, but which is mainly the power of a few people whom we believe we have chosen through elections.

But, after the printing press, we are currently witnessing the second major evolution of the technology known as “writing”: the Internet. We are in the middle of the transition, so it is very hard to predict what will happen, but there are two things we can be sure of.

1. Traditional media will be tomorrow what religion is today: something that still exists out of habit, but obsolete and decaying.
2. Democratic power will be tomorrow what aristocracy is today: something people don’t really care about anymore.

2) Can we hold algorithmic decision-making systems accountable?

The main problem is that most modern algorithms are evolving and learning. That’s why we often talk about “AI”. This means that, in the simplest case, there are at least four elements involved in any decision taken by an algorithm:

1. The algorithm itself, as written by the programmer.
2. The training data, fed by the software producer to “initialise” the algorithm.
3. The data encountered by the algorithm during its normal use.
4. The input by the user.

If an algorithm makes a really bad decision, it is nearly impossible to know “why” it made that decision. So whom will we hold accountable?
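A minimal sketch of why attribution is so hard, assuming scikit-learn and entirely made-up toy data: the very same algorithm code, fed two slightly different training sets, reaches opposite decisions on the same user input.

```python
# Toy illustration: identical code, different training data, opposite decisions.
# All numbers below are invented for the example.
from sklearn.tree import DecisionTreeClassifier

user_input = [[4.0, 1.0]]  # the case the system must decide on

# Training data as fed by one software producer...
data_a, labels_a = [[1, 0], [2, 1], [5, 1], [6, 0]], [0, 0, 1, 1]
# ...and a slightly different set from another producer.
data_b, labels_b = [[1, 1], [2, 0], [5, 0], [6, 1]], [1, 1, 0, 0]

for name, data, labels in [("A", data_a, labels_a), ("B", data_b, labels_b)]:
    model = DecisionTreeClassifier(random_state=0).fit(data, labels)
    print(f"Training set {name} -> decision: {model.predict(user_input)[0]}")
    # Same code, same user input, different outcome. Which of the four
    # elements listed above (code, training data, runtime data, user input)
    # is to blame?
```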

Maybe, in the future, algorithms will have their own “life”. They will be considered legal entities, and we will be able to judge them. An algorithm could be deleted forever. We may start to think about algorithms the way we think about animals: entities that can help us but that could also harm us.

3) Is it possible or desirable to build moral principles into AI systems?

In fact, to some degree, coders already build their own moral system into any software they write. That’s nothing new. When you write geotracking software for Google, you know perfectly well that it will be used to display targeted advertising.

But, in the long term, AI systems will build their own morals.

Just as we realised that we could not write software that drives a car but could write software that learns how to drive a car, we will realise that we can’t write moral software but we can build software that learns about morality. The problem is that there’s a huge difference between the moral principles we believe are true and the moral principles we apply in everyday life. Software does not learn what we preach but what we really do. That’s probably why we are scared of unethical software: it’s simply our own reflection in the mirror.

4) What do you consider the next 5 years’ challenge/trend for the research and innovation in your field of expertise?

Social choice theory demonstrated, decades ago, that there is no election system that always elects the “true choice” of the voters. The biggest challenge in this area will be to think outside the box and build systems that are not “election systems” anymore. After all, why should we elect someone to take a decision in our name when we could take that decision in real time? Or we could ensure that only the people affected by a decision have a voice. The simplest example: why should men be able to decide the laws about abortion?
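A minimal worked example of the kind of impossibility result alluded to above (the three voters and their rankings are invented purely for illustration): with pairwise majority voting, collective preferences can form a cycle, so no candidate is unambiguously the voters’ “true choice”.

```python
from itertools import combinations

# Three invented voters, each ranking candidates A, B, C from best to worst.
ballots = [
    ["A", "B", "C"],
    ["B", "C", "A"],
    ["C", "A", "B"],
]

def pairwise_winner(x, y):
    """Return whichever of x, y a majority of voters ranks higher."""
    votes_for_x = sum(1 for b in ballots if b.index(x) < b.index(y))
    return x if votes_for_x > len(ballots) / 2 else y

for x, y in combinations("ABC", 2):
    print(f"{x} vs {y}: majority prefers {pairwise_winner(x, y)}")

# Result: A beats B, B beats C, yet C beats A, a Condorcet cycle.
# Majority preference is not transitive, so there is no single "true choice".
```

Any election system then has to break such a cycle somehow, and every way of breaking it disadvantages some voters, which is the core of the impossibility results in social choice theory.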

Thinking outside the box will require true interdisciplinary cooperation. Technologies like blockchains and AI allow us to think in different ways about sociology and politics. But most engineers don’t know the first thing about sociology or anthropology, while sociologists have few gateways to understand why those technologies may change society. Maybe we need a new kind of scientist that I call “techno-sociologists”.

I like to illustrate this with an anecdote: as far as I know, nobody truly understands how Bitcoin works from a techno-sociological point of view.

5) What are the latest updates on “Liquid Democracy”? How would you comment on the impact of the disruption of institutions in general, up to this point?

Liquid democracy is neither a technology nor a clear idea. It is more a broad concept, like universal basic income. Like basic income, liquid democracy is still an idea in its infancy. In the short term, nothing will change and no disruption will happen. People will try democracy.earth or work with colony.io and say “it’s cool, but it’s not a revolution”.

Then, one morning, we will realise that we don’t really care about institutions anymore. Like religion and aristocracy, those institutions will still be there, and they will still have a niche market, but we will no longer really care about them.