Your brain is, in itself, a big data-processing machine.
I admit that this analogy is far from being the most glamorous way to imagine the state of one’s brain, but it is one way to think about intelligence.
As human beings, we naturally absorb data. In fact, we have millions of different sensors all over our bodies, starting with our five basic senses.
Your brain synthesises this data and builds a model of the world around you, processing gigabytes of data in real time.
Subsequently, based on the data your brain collects, you make decisions.
You decide your next course of action by weighing this model of the world against your objectives and priorities.
The process of collecting data, analysing it and making decisions is a core part of what makes us human.
I carried similar thoughts through my Ph.D., when I studied how vision and movement work in mammals — from the moment light hits our eyes, to the 3D picture of the world we see in our minds.
The truth is that today our planet is covered with billions of Internet-enabled devices collecting data in real time: websites, credit cards, GPS, social media, satellites, AVs, and even drones. It is as if we had a million copies of each sense organ, all active at the same time.
In theory, we should all be super aware of what is happening globally, but in reality most of our information comes via third parties: experts, news articles, opinion pieces, Wikipedia entries. (And unfortunately, you can always find an expert to support any opinion you want!)
Somehow, the world is filled with more data than ever, yet our brains are still unable to turn it into a better model of the world.
So, how do we fix this?
What exactly is a bicycle for data?
Steve Jobs famously described the computer as a ‘bicycle for the mind’.
The idea is that just as a bicycle amplifies the power of our muscles, a computer amplifies the power of our brains.
Writing code is an early example of this kind of brain amplification through technology. In the past, people used computers by directly inputting machine instructions on punch cards. With code, programmers could express the program as something that was closer to a story. This was a radical notion in the 1950s!
Fast forward to 2020, and we generally think of coding as a highly technical skill. We’ve seen countless revolutions in what is now called “human-computer interaction”, and our tools have evolved to become extensions of ourselves.
As a matter of fact, we have reached a point in technology where we can rent someone’s house by simply clicking a few links on our smartphones — but with data, the advancement hasn’t been the same.
We have ‘millions’ of active senses, but we still need coding to turn data into a model of what is happening.
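To see what that coding barrier looks like in practice, here is a minimal sketch (using a small, hypothetical dataset of daily website visits — not from the original text): even answering a question as simple as "what is the average, and which day was busiest?" takes a handful of lines of code.

```python
import csv
import io
from statistics import mean

# Hypothetical data: one of our 'millions of senses', a website traffic log.
raw = """day,visits
Mon,120
Tue,135
Wed,98
Thu,160
Fri,142
"""

# Parse the CSV into rows of {"day": ..., "visits": ...}.
rows = list(csv.DictReader(io.StringIO(raw)))
visits = [int(row["visits"]) for row in rows]

# Turn raw data into a tiny 'model': a summary statistic and a peak.
average = mean(visits)
busiest = max(rows, key=lambda row: int(row["visits"]))["day"]

print(f"Average daily visits: {average}")  # 131
print(f"Busiest day: {busiest}")           # Thu
```

Nothing here is conceptually hard, but each step — parsing, type conversion, aggregation — is a line of code standing between the data and the model.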
Now, imagine a world where we could synthesise data as effortlessly as we use a smartphone, without writing any code.
Every technology needs an interface: the medium through which we communicate with computers and get feedback. Think of the word document, the spreadsheet, or the message thread.
Existing data science vendors would like you to believe that the interface for data science is a workflow builder, a BI tool, a dashboard, or SQL.
But for data scientists themselves, the tool of choice is the notebook.
Why notebooks? Data science is about telling a story — the story your data tells you about the world. Notebooks let you tell this story in an interactive document that combines code, visualisations, and text.
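As a rough illustration of that interleaving of narrative and code, here is a sketch of how a notebook's cells might read, with the "markdown cells" shown as comments (the sales figures and the promotion scenario are entirely hypothetical, invented for this example):

```python
from statistics import mean

# --- Markdown cell ----------------------------------------------------
# "# Monthly sales review
#  Did the spring promotion move the needle?"

# --- Code cell --------------------------------------------------------
sales_before = [1200, 1150, 1300]  # Jan-Mar, hypothetical figures
sales_after = [1500, 1620, 1580]   # Apr-Jun, after the promotion

# Relative lift: how much higher is the average after the promotion?
lift = mean(sales_after) / mean(sales_before) - 1
print(f"Average lift: {lift:.0%}")

# --- Markdown cell ----------------------------------------------------
# "Sales rose by roughly 29% after the promotion — the data tells the
#  story of a successful campaign."
```

Each cell advances the narrative: a question, a computation, a conclusion. That question-answer rhythm is what makes notebooks a storytelling medium rather than just a code editor.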
And the best thing about notebooks is that they democratise data science. Anyone can write one. You don’t have to ask permission.
But as powerful as they are, notebooks are still fundamentally about coding.
That’s why we’ve built Vayu, the no-code notebook. It’s a ‘bicycle’ for data science, designed to be as simple as a word document.
- Interactive documents, to tell the story of your data.
- The full power of SQL, R, and Python, without coding.
- Instant feedback, powered by the latest hardware in our cloud (including GPUs).
- Seamlessly connect to your data from any source (files, databases, external apps, APIs).
- Host your data for anyone to use.
- Collaborate in real-time and share your notebooks.
- In-built connections to external services, for NLP, ML, and AI.
- Scale-up from GBs to PBs, without managing any infrastructure.
- Turn your notebooks into templates for other people to use.
- Automatic recommendations based on usage inside and outside your organisation.
With no-code Data Science, we are hoping that it becomes normal for all of us to base our decisions on our world’s real data. Whether you are running a business, a non-profit or if you simply want to make sense of a political issue — you, and anyone you work with, can have effortless access to an accurate model of the things you care about.
If you would like to be a part of this, you can join our beta.
By David Kell