Document of Significance

Questions of Technology, Knowledge, Privacy and Efficiency

Sundae Labs’ mission is inspired, in part, by a Socratic principle – that our well-being and our self-knowledge must depend on one another. “The examined life” is one of man’s greatest gifts. It helps a person improve himself by being conscious of himself, becoming aware of his patterns of thinking and acting, his identity, his way of conducting himself in the world. If he does this diligently, he can become aware of behaviours that thwart the advancement of his life, and the lives of his fellow human beings. 


Smart technology has not enriched this project. It made us numerous promises: to make life better, provide autonomy, give access to more knowledge, make better use of our time and energy, and allow us to fulfill our potential. But these promises are long since broken; technology has complicated man’s world without adding to his wisdom, and so has made him more dangerous to himself. It has magnified man’s impulses while reducing the forces of time, distance and difficulty that regulate him. It has made his thinking more dissociative and distractible, and made his sociability more narcissistic and less intimate. It multiplied possibilities so much that he cannot make choices soundly. It created so many competing stories that he cannot tell which are real. It created so many authorities that he does not know which to trust. To him, the world looks like a funhouse of refracted light and mirrors. It moves through him so rapidly that he works harder just to stay in one place.  So he feels burnout instead of progress, a sense of futility, and absurdity about the whole situation: all of this technological abundance could be helping him, if only he knew how to harness it. 


In this process, something else has happened. The idea of the human being is somehow smaller. He has been capable of great feats throughout his history. He built mind palaces of memory, and opened vast spaces in himself to explore and create. But as the smartphone roots itself, it is harder to separate the human being from his technological appendage. It becomes his default way of idling, socializing, and solving problems. When he is caught without his device, man feels severed from himself. This dependence is reflexive, and gives him a constant feeling of anxiety, of being powerless and naked on his own. Even when he is aware of its dangers he usually cannot bring himself to act on them; he does not feel that he has any alternative to choose. He lives in this double bind, and it makes him feel less like a human being and more like a restless beast. As the philosopher Martin Heidegger famously warned, technology is not fitting into the frame of the human being; he is being remade to fit into its frame, and is finding himself shrinking in the process.

The Problem of Data 

Meanwhile, the person gives himself away. Through apps, subscriptions, and various commercial activities, his data is extracted from him. All the details of his life – his health, habits, hobbies, moods, taste in food, art and sex – are given to parties that do not care for him, that use the data to nudge his behaviour and improve their products and strategies. He receives a service in return, but not the knowledge that could be gleaned from his data. If aggregated, this knowledge could be powerful, and frightening. His browser history can reveal the objects of his interest and desire; keystrokes and speech patterns can reveal his process of thinking; his movements can reveal his habits and vices; and his biometrics can indicate his health and temperament. It is now possible to create models and profiles of a person with these trackable metrics, as though we were naturalists observing a foreign species. We can image his physical state, his mental state, and the patterns that make up his life and relationships. 


The Problem of Privacy 

These patterns are sensitive, and very revealing. If made available to other parties, they leave a person exposed to shame, theft, reductive profiling, and a myriad of other harms. This is the environment we find ourselves in: a landscape rife with data brokerage and breaches of privacy. Laws around the world have failed to keep up with this problem and provide adequate safeguards. The intuitive significance of privacy has not been properly translated into our ethical consciousness, much less into public policy. And so, the aforementioned double bind: people are both anxious about technology and utterly incapable of choosing against it. This is why privacy is a pivotal concern, and anything that Sundae Labs hopes to achieve must be founded on it, not merely to protect a person from concrete harm, but from the psychological danger of being watched. Our digital worlds are now extensions of our home. They are our bedrooms, our bathrooms, our living rooms. The presence of interloping eyes affects our sense of solitude. It affects how honestly we are able to look at ourselves, how directly we are able to talk to ourselves. How do we ensure that our data cannot be breached by the outside observer, and give the person the space to breathe and see himself clearly? 


The Question of Knowledge 

If privacy were accounted for, what then? If a person’s data were made available to him, and only him, could he learn from it without the presence of prying eyes? This would not be the end of the danger. His raw data, if made available, could also be harmful to him. He could be overwhelmed by it and not know what to do, or be ashamed of what he sees and have his worst anxieties confirmed. He could make incorrect inferences and act on spurious assumptions. He could believe he knows more than he does, mistaking a digital reflection for the totality of himself. Any one of these reactions may lead him to self-destructive behaviour. So before endowing him with more tech, we must ask: does the human being have the wisdom to use such knowledge for good? How much information can he hold maturely? How should it be reflected back at him? What kind of knowledge should he strive for, and what kind of knowledge could tear him away from himself? All “tech” innovators, if they have genuine interest in humanity, must ask these questions. Knowledge comes with consequences we don't understand, and we have many cautionary myths that warn against the reckless search for innovation.


In order to confront these questions, we must consider the nature of human data. How should we understand its meaning? Can we frame it in a philosophical way? Finding the right metaphor is often useful for taming technologies. For example, data has often been called the “new oil.” Consider this metaphor, and find some revealing features: a crude, flammable resource, something finite and possessable; hardly the best symbol to reflect the fruits of the human spirit. Now consider an alternative metaphor: sunshine. The human being is not just a source of energy, but also of intelligibility – he is a source of light, voice, vision and meaning. The sun is an ancient symbol in philosophy and religious life, a proxy for the limitless part of reality and humanity. It cannot be completely known, or completely tamed. But it gives off a radiance that, if cultivated in modest ways, can be used to light the darker corners of life. Mirrors can light up a room by diffusing a single source of sunlight, without extinguishing that light, or being equal to it.  


If human data is oil, then the human being can effectively be used up, owned by another party, or put to waste. If data is like sunshine, then each human being is an end in himself, and cannot be exhausted. The data he gives off can shed light back on himself without being confused for himself. A mirror does not know the feeling of the sun on its face, but it can still be made to reflect the light, so a person can see himself by the very light he casts. Instead of the funhouse, what if technology could become a reflective surface to harness the interpretive power of our data and direct it to those places that need our attention? What insight would become possible for us, as individuals and as a species, if we learned to direct our data in this way? 


The Question of Ease

It is the prerogative of most technology, especially mobility technology, to make our lives more efficient. We use it to bypass the inconveniences that take up our time, agitate us, and waste our energy. It also helps us avoid interactions that make life more intimidating and less smooth to the touch – interactions between friends, between dates, between buyers and sellers, teachers and students, etc. Our technology follows a “principle of least effort” as the linguist George Kingsley Zipf famously called it. It offers us a shorter distance between points in time and space.   


This efficiency, of course, feeds many of our problems. For everything it makes easier (purchases, payments, commutes, etc.) it makes the world more virtual by default, and less tangible. By using more of our data, it widens the exposure to privacy invasion. The infrastructure around us now presupposes the tech, and so becomes paradoxically less accessible the moment we are separated from our devices. This adds to our sense of dependence, and reinforces the double bind. To the degree that it does make our lives easier, the value of that ease is usually unquestioned. Surely there is nothing wrong with making day-to-day life more convenient, we might say. We have limited time on this earth, after all. What could be wrong with jumping the line?

But as we must remember, shorter distances tend to bypass scenic routes. Efficiency is not an absolute value, but a relative one, and we need the wisdom to apply it judiciously. In our eagerness to make life free of discomfort, we must make sure we are not shortchanging experiences that are meaningful for human beings. Every time we outsource a skill, we tend to lose it. Sometimes the trade-off seems innocent (do we truly need to remember phone numbers?) or what we gain seems worth the trade (online maps make us poor navigators, but most trips are now safer). Sometimes the shortcuts can be offset with compensations; having our music curated by algorithms can reinforce our tastes, but these apps can also be designed to expose us to things outside our preferences.


But in other areas of life, the “ease” of tech can be existentially crippling. This is especially true of sociability and sexual companionship. A person is pickier about prospective partners when he sees dozens of faces on a grid. He is more aware of possibilities, and less willing to commit to any one of them – any one person, or place, or career. Now that he can visualize so many paths, it takes longer to search for the right one. And yet it feels obvious that, in all these possibilities, a perfect match must be somewhere out there. This is another kind of double bind, and many people get caught in it as they search for a perfect life. Meanwhile, if our tech helps bypass awkward first meetings or other social obstacles, it may remove experiences that help a person mature – how to get lost and find one’s way again, how to navigate or negotiate, how to approach another person, how to have real conversations. We want to make a person’s time and energy more efficient, but not to reinforce his avoidance of necessary interactions, trials and errors that can strengthen his implicit learning and expose him to things he wouldn’t undertake by choice. 


So this is another question for Sundae Labs to consider: how can we understand the value – and the consequences – of removing obstacles from our lives, if some of those obstacles also conceal “experience points” necessary for our own development? What is the cost of eliminating novelty and discomfort from our lives? Technology must be regarded with suspicion to the degree that it makes us less human, to the degree it allows us to bypass those confrontations that are necessary for encountering life. We don’t want to create a tool that helps us avoid reality. We want a tool that helps us confront it.


What is Possible Now?

We live in an untenable situation: humanity is serving technology rather than the other way around. With the proliferation of AI, this problem is only going to get worse. But our world of tech is here to stay, and it must be rebuilt around a new kind of relationship, a new vision. The mission of Sundae Labs is to flip the script on these broken promises, and use our tech to serve our humanity. If our tech is going to help resolve the problems it created, it must prioritize the following values: 


  • Safety & Privacy. Use cutting-edge hardware to prevent data breaches and identity theft, safeguarding privacy above all else, and making sure your data is yours and only yours. 

  • Self-Knowledge. Help you navigate the overabundance of your own data, and help you find real, relevant and revealing patterns in your bio-psycho-social metrics, creating a personal mirror that only you can see.

  • Health & Stability. Use rigorous scientific measures to help make sense of your data, providing you with insights and credible recommendations to help chart and improve your physical and mental health.

  • Meaning & Relationships. Provide tools to help you find focus and vocation in your life, and create conditions that afford deeper connections with other human beings.


Protecting privacy is paramount. Designing hardware and software for absolute safety is the uncompromising priority of Sundae Labs, the foundation that makes everything else possible. If we design our tech this way, it will begin to free us from the double bind: hardware that is safe, that gives us more control, supports us, and makes us less dependent on itself. What if we had a technology that actively pointed away from the virtual world and back at real life? Security and privacy usually trade off, but it is imperative to have both. Sundae Labs will design a tool that helps you manage the excess of information around you, the excess of possibilities available at any given moment. It will help you detect real patterns in your data using sound science, help you track your states of mind, and identify trends in your life without being obtrusive or presumptuous. Ultimately, we need a tool that helps us navigate the world without becoming a crutch, something that can catch our sunshine and reflect it in a way that helps improve our lives and fulfill our potential as human beings. 

Sundae Labs’ mission is inspired, in part, by a Socratic principle – that our well-being and our self-knowledge must depend on one another. “The examined life” is one of man’s greatest gifts. It helps a person improve himself by being conscious of himself, becoming aware of his patterns of thinking and acting, his identity, his way of conducting himself in the world. If he does this diligently, he can become aware of behaviours that thwart the advancement of his life, and the lives of his fellow human beings. 


Smart technology has not enriched this project. It made us numerous promises: to make life better, provide autonomy, give access to more knowledge, make better use of our time and energy, and allow us to fulfill our potential. But these promises are long since broken; technology has complicated man’s world without adding to his wisdom, and so has made him more dangerous to himself. It has magnified man’s impulses while reducing the forces of time, distance and difficulty that regulate him. It has made his thinking more dissociative and distractible, and made his sociability more narcissistic and less intimate. It multiplied possibilities so much that he cannot make choices soundly. It created so many competing stories that he cannot tell which are real. It created so many authorities that he does not know which to trust. To him, the world looks like a funhouse of refracted light and mirrors. It moves through him so rapidly that he works harder just to stay in one place.  So he feels burnout instead of progress, a sense of futility, and absurdity about the whole situation: all of this technological abundance could be helping him, if only he knew how to harness it. 


In this process, something else has happened. The idea of the human being is somehow smaller. He has been capable of great feats throughout his history. He built mind palaces of memory, and opened vast spaces in himself to explore and create. But as the smartphone roots itself, it is harder to separate the human being from his technological appendage. It becomes his default way of idling, socializing, and solving problems. When he is caught without his device, man feels severed from himself. This dependence is reflexive, and gives him a constant feeling of anxiety, of being powerless and naked on his own. Even when he is aware of its dangers he usually cannot bring himself to act on them; he does not feel that he has any alternative to choose. He lives in this double bind, and it makes him feel less like a human being and more like a restless beast. As the philosopher Martin Heidegger famously warned, technology is not fitting into the frame of the human being; he is being remade to fit into its frame, and is finding himself shrinking in the process.

The Problem of Data 

Meanwhile, the person gives himself away. Through apps, subscriptions, and various commercial activities, his data is extracted from him. All the details of his life – his health, habits, hobbies, moods, taste in food, art and sex – are given to parties that do not care for him, that use the data to nudge his behaviour and improve their products and strategies. He receives a service in return, but not the knowledge that could be gleaned from his data. If aggregated, this knowledge could be powerful, and frightening. His browser history can reveal the objects of his interest and desire; keystrokes and speech patterns can reveal his process of thinking; his movements can reveal his habits and vices; and his biometrics can indicate his heath and temperament. It is now possible to create models and profiles of a person with these trackable metrics, as though we were naturalists observing a foreign species. We can image his physical state, his mental state, and the patterns that make up his life and relationships. 


The Problem of Privacy 

These patterns are sensitive, and very revealing. If made available to other parties, they leave a person exposed to shame, theft, reductive profiling, and a myriad of other harms. This is the environment we find ourselves in: a landscape rife with data brokerage and breaches of privacy. Laws around the world have failed to keep up with this problem and provide adequate safeguards. The intuitive significance of privacy has not been properly translated into our ethical consciousness, much less into public policy. And so, the aforementioned double bind: people are both anxious about technology and utterly incapable of choosing against it. This is why privacy is a pivotal concern, and anything that Sundae Labs hopes to achieve must founded on it, not merely to protect a person from concrete harm, but from the psychological danger of being watched. Our digital worlds are now extensions of our home. They are our bedrooms, our bathrooms, our living rooms. The presence of interloping eyes affects our sense of solitude. It affects how honestly we are able to look at ourselves, how directly we are able to talk to ourselves. How do we ensure that our data cannot be breached by the outside observer, and give the person the space to breathe and see himself clearly? 


The Question of Knowledge 

If privacy was accounted for, what then? If a person’s data was made available to him, and only him, could he learn from it without the presence of prying eyes? This would not be the end of the danger. His raw data, if made available, could also be harmful to him. He could be overwhelmed by it and not know what to do, or be ashamed of what he sees and have his worst anxieties confirmed. He could make incorrect inferences and act on spurious assumptions. He could believe he knows more than he does, mistaking a digital reflection for the totality of himself. Any one of these reactions may lead him to self- destructive behaviour. So before endowing him with more tech, we must ask: does the human being have the wisdom to use such knowledge for good? How much information can he hold maturely? How should it be reflected back at him? What kind of knowledge should strive for, and what kind of knowledge could tear him away from himself? All “tech” innovators, if they have genuine interest in humanity, must ask these questions. Knowledge comes with consequences we don't understand, and we have many cautionary myths that warn against the reckless search for innovation.


In order to confront these questions, we must consider the nature of human data. How should we understand its meaning? Can we frame it in a philosophical way? Finding the right metaphor is often useful for taming technologies. For example, data has often been called the “new oil.” Consider this metaphor, and find some revealing features: a crude, flammable resource, something finite and possessable; hardly the best symbol to reflect the fruits of the human spirit. Now consider an alternative metaphor: sunshine. The human being is not just a source of energy, but also of intelligibility – he is a source of light, voice, vision and meaning. The sun is an ancient symbol in philosophy and religious life, a proxy for the limitless part of reality and humanity. It cannot be completely known, or completely tamed. But it gives off a radiance that, if cultivated in modest ways, can be used to light the darker corners of life. Mirrors can light up a room by diffusing a single source of sunlight, without extinguishing that light, or being equal to it.  


If human data is oil, then the human being can effectively be used up, owned by another party, or put to waste. If data is like sunshine, then each human being is an end in himself, and cannot be exhausted. The data he gives off can shed light back on himself without being confused for himself. A mirror does not know the feeling of the sun on its face, but it can still be made to reflect the light, so a person can see himself by the very light he casts. Instead of the funhouse, what if technology could become a reflective surface to harness the interpretive power of our data and direct it to those places that need our attention? What insight would become possible for us, as individuals and as a species, if we learned to direct our data in this way? 


The Question of Ease

It is the prerogative of most technology, especially mobility technology, to make our lives more efficient. We use it to bypass the inconveniences that take up our time, agitate us, and waste our energy. It also helps us avoid interactions that make life more intimidating and less smooth to the touch – interactions between friends, between dates, between buyers and sellers, teachers and students, etc. Our technology follows a “principle of least effort” as the linguist George Kingsley Zipf famously called it. It offers us a shorter distance between points in time and space.   


This efficiency, of course, feeds many of our problems. For everything it makes easier (purchases, payments, commutes, etc.) it makes the world more virtual by default, and less tangible. By using more of our data, it widens the exposure to privacy invasion. The infrastructure around us now presupposes the tech, and so becomes paradoxically less accessible the moment we are separated from our devices. This adds to our sense of dependence, and reinforces the double bind. To the degree that it does make our lives easier, the value of that ease is usually unquestioned. Surely there is nothing wrong with making day-to-day life more convenient, we might ask. We have limited time on this earth, after all. What could be wrong with jumping the line?

But as we must remember, shorter distances tend to bypass scenic routes. Efficiency is not an absolute value, but a relative one, and we need to the wisdom to apply it judiciously. In our eagerness to make life free of discomfort, we must make sure we are not shortchanging experiences that are meaningful for human beings. Every time we outsource a skill, we tend to lose it. Sometimes the trade-off seems innocent (do we truly need to remember phone numbers?) or what we gain seems worth the trade (online maps make us poor navigators but most trips are now safer). Sometimes the shortcuts can be offset with compensations; having our music curated by algorithms can reinforce our tastes, but these apps can also be designed to expose us to things outside our preferences.


But in other areas of life, the “ease” of tech can be existentially crippling. This is especially true of sociability and sexual companionship. A person is pickier abut prospective partners when he sees dozens of faces on a grid. He is more aware of possibilities, and less willing to commit to any one of them – any one person, or place, or career. Now that he can visualize so many paths, it takes longer to search for the right one. And yet it feels obvious that, in all these possibilities, a perfect match must somewhere out there. This is another kind of double bind, and many people get caught in it as they search for a perfect life. Meanwhile, if our tech helps bypass awkward first meetings or other social obstacles, it may remove experiences that help a person mature – learn how to get lost and find oneself again, how to navigate, or negotiate, approach another person, or have real conversations. We want to make a person’s time and energy more efficient, but not to reinforce his avoidance of necessary interactions, trials and errors that can strengthen his implicit learning and expose him to things he wouldn’t undertake by choice. 


So this is another question for Sundae Labs to consider: how can we understand the value – and the consequences – of removing obstacles from our lives, if some of those obstacles also conceal “experience points” necessary for our own development? What is the cost of eliminating novelty and discomfort from our lives? Technology must be regarded with suspicion to the degree that it makes us less human, to the degree it allows us to bypass those confrontations that are necessary for encountering life. We don’t want to create a tool that helps us avoid reality. We want a tool that helps us confront it.


What is Possible Now?

We live in an untenable situation: humanity is serving technology rather than the other way around. With the proliferation of AI, this problem is only going to get worse. But our world of tech is here to stay, and it must be innovated with a new kind of relationship, a new vision. The mission of Sundae Labs is to flip the script on these broken promises, and use our tech to serve our humanity. If our tech is going to help resolve the problems it created, it must prioritize the following values: 


  • Safety & Privacy. Use cutting edge hardware to prevent data breach and identity theft, safeguarding privacy above all else, and making sure your data is yours and only yours. 

  • Self-Knowledge. Help you navigate the overabundance of your own data, and help  find real, relevant and revealing patterns in your bio-psycho-social metrics, creating a personal mirror that only you can see.

  • Health & Stability. Use rigorous scientific measures to help make sense of your data, providing you with insights and credible recommendations to help chart and improve your physical and mental health. ​

  • Meaning & Relationships. Provide tools to help you find focus and vocation in your life, and create conditions to afford deeper connections with other human beings. ​


Protecting privacy is paramount. Designing hardware and software for absolute safety is the uncompromisable priority of Sundae Labs, the main mission that allows for everything else to be possible. If we design our tech this way, it will begin to free us from the double bind; hardware that is safe, that gives us more control, supports us, and makes us less dependent on itself. What if we had a technology that actively pointed away from the virtual world and back at real life? Security and privacy usually trade off, but it is imperative to have both. Sundae Labs will design a tool that helps you manage the excess of information around you, the excess of possibilities available at any given moment. It will help you detect real patterns in your data using sound science, help you picking up your states of mind, and identify trends in your life without being obtrusive or presumptuous. Ultimately, we need a tool that helps us navigate the world without becoming a crutch, something that can catch our sunshine and reflect it in a way that helps improve our lives and fulfill our potential as human beings. 

Sundae Labs’ mission is inspired, in part, by a Socratic principle – that our well-being and our self-knowledge must depend on one another. “The examined life” is one of man’s greatest gifts. It helps a person improve himself by being conscious of himself, becoming aware of his patterns of thinking and acting, his identity, his way of conducting himself in the world. If he does this diligently, he can become aware of behaviours that thwart the advancement of his life, and the lives of his fellow human beings. 


Smart technology has not enriched this project. It made us numerous promises: to make life better, provide autonomy, give access to more knowledge, make better use of our time and energy, and allow us to fulfill our potential. But these promises are long since broken; technology has complicated man’s world without adding to his wisdom, and so has made him more dangerous to himself. It has magnified man’s impulses while reducing the forces of time, distance and difficulty that regulate him. It has made his thinking more dissociative and distractible, and made his sociability more narcissistic and less intimate. It multiplied possibilities so much that he cannot make choices soundly. It created so many competing stories that he cannot tell which are real. It created so many authorities that he does not know which to trust. To him, the world looks like a funhouse of refracted light and mirrors. It moves through him so rapidly that he works harder just to stay in one place.  So he feels burnout instead of progress, a sense of futility, and absurdity about the whole situation: all of this technological abundance could be helping him, if only he knew how to harness it. 


In this process, something else has happened. The idea of the human being is somehow smaller. He has been capable of great feats throughout his history. He built mind palaces of memory, and opened vast spaces in himself to explore and create. But as the smartphone roots itself, it is harder to separate the human being from his technological appendage. It becomes his default way of idling, socializing, and solving problems. When he is caught without his device, man feels severed from himself. This dependence is reflexive, and gives him a constant feeling of anxiety, of being powerless and naked on his own. Even when he is aware of its dangers he usually cannot bring himself to act on them; he does not feel that he has any alternative to choose. He lives in this double bind, and it makes him feel less like a human being and more like a restless beast. As the philosopher Martin Heidegger famously warned, technology is not fitting into the frame of the human being; he is being remade to fit into its frame, and is finding himself shrinking in the process.

The Problem of Data 

Meanwhile, the person gives himself away. Through apps, subscriptions, and various commercial activities, his data is extracted from him. All the details of his life – his health, habits, hobbies, moods, taste in food, art and sex – are given to parties that do not care for him, that use the data to nudge his behaviour and improve their products and strategies. He receives a service in return, but not the knowledge that could be gleaned from his data. If aggregated, this knowledge could be powerful, and frightening. His browser history can reveal the objects of his interest and desire; keystrokes and speech patterns can reveal his process of thinking; his movements can reveal his habits and vices; and his biometrics can indicate his heath and temperament. It is now possible to create models and profiles of a person with these trackable metrics, as though we were naturalists observing a foreign species. We can image his physical state, his mental state, and the patterns that make up his life and relationships. 


The Problem of Privacy 

These patterns are sensitive, and very revealing. If made available to other parties, they leave a person exposed to shame, theft, reductive profiling, and a myriad of other harms. This is the environment we find ourselves in: a landscape rife with data brokerage and breaches of privacy. Laws around the world have failed to keep up with this problem and provide adequate safeguards. The intuitive significance of privacy has not been properly translated into our ethical consciousness, much less into public policy. And so, the aforementioned double bind: people are both anxious about technology and utterly incapable of choosing against it. This is why privacy is a pivotal concern, and anything that Sundae Labs hopes to achieve must be founded on it, not merely to protect a person from concrete harm, but from the psychological danger of being watched. Our digital worlds are now extensions of our home. They are our bedrooms, our bathrooms, our living rooms. The presence of interloping eyes affects our sense of solitude. It affects how honestly we are able to look at ourselves, how directly we are able to talk to ourselves. How do we ensure that our data cannot be breached by the outside observer, and give the person the space to breathe and see himself clearly?


The Question of Knowledge 

If privacy were accounted for, what then? If a person’s data were made available to him, and only him, could he learn from it without the presence of prying eyes? This would not be the end of the danger. His raw data, if made available, could also be harmful to him. He could be overwhelmed by it and not know what to do, or be ashamed of what he sees and have his worst anxieties confirmed. He could make incorrect inferences and act on spurious assumptions. He could believe he knows more than he does, mistaking a digital reflection for the totality of himself. Any one of these reactions may lead him to self-destructive behaviour. So before endowing him with more tech, we must ask: does the human being have the wisdom to use such knowledge for good? How much information can he hold maturely? How should it be reflected back at him? What kind of knowledge should he strive for, and what kind of knowledge could tear him away from himself? All “tech” innovators, if they have genuine interest in humanity, must ask these questions. Knowledge comes with consequences we do not understand, and we have many cautionary myths that warn against its reckless pursuit.


In order to confront these questions, we must consider the nature of human data. How should we understand its meaning? Can we frame it in a philosophical way? Finding the right metaphor is often useful for taming technologies. For example, data has often been called the “new oil.” Consider this metaphor, and some revealing features emerge: a crude, flammable resource, something finite and possessable; hardly the best symbol for the fruits of the human spirit. Now consider an alternative metaphor: sunshine. The human being is not just a source of energy, but also of intelligibility – he is a source of light, voice, vision and meaning. The sun is an ancient symbol in philosophy and religious life, a proxy for the limitless part of reality and humanity. It cannot be completely known, or completely tamed. But it gives off a radiance that, if cultivated in modest ways, can be used to light the darker corners of life. Mirrors can light up a room by diffusing a single source of sunlight, without extinguishing that light, or being equal to it.


If human data is oil, then the human being can effectively be used up, owned by another party, or put to waste. If data is like sunshine, then each human being is an end in himself, and cannot be exhausted. The data he gives off can shed light back on himself without being confused for himself. A mirror does not know the feeling of the sun on its face, but it can still be made to reflect the light, so a person can see himself by the very light he casts. Instead of the funhouse, what if technology could become a reflective surface to harness the interpretive power of our data and direct it to those places that need our attention? What insight would become possible for us, as individuals and as a species, if we learned to direct our data in this way? 


The Question of Ease

It is the promise of most technology, especially mobility technology, to make our lives more efficient. We use it to bypass the inconveniences that take up our time, agitate us, and waste our energy. It also helps us avoid interactions that make life more intimidating and less smooth to the touch – interactions between friends, between dates, between buyers and sellers, teachers and students, etc. Our technology follows a “principle of least effort,” as the linguist George Kingsley Zipf famously called it. It offers us a shorter distance between points in time and space.


This efficiency, of course, feeds many of our problems. For everything it makes easier (purchases, payments, commutes, etc.), it makes the world more virtual by default, and less tangible. By using more of our data, it widens our exposure to privacy invasion. The infrastructure around us now presupposes the tech, and so becomes paradoxically less accessible the moment we are separated from our devices. This adds to our sense of dependence, and reinforces the double bind. To the degree that it does make our lives easier, the value of that ease usually goes unquestioned. Surely there is nothing wrong with making day-to-day life more convenient, we might think. We have limited time on this earth, after all. What could be wrong with jumping the line?

But we must remember that shorter distances tend to bypass scenic routes. Efficiency is not an absolute value, but a relative one, and we need the wisdom to apply it judiciously. In our eagerness to make life free of discomfort, we must make sure we are not shortchanging experiences that are meaningful for human beings. Every time we outsource a skill, we tend to lose it. Sometimes the trade-off seems innocent (do we truly need to remember phone numbers?) or what we gain seems worth the trade (online maps make us poor navigators, but most trips are now safer). Sometimes the shortcuts can be offset with compensations: having our music curated by algorithms can reinforce our tastes, but these apps can also be designed to expose us to things outside our preferences.


But in other areas of life, the “ease” of tech can be existentially crippling. This is especially true of sociability and sexual companionship. A person is pickier about prospective partners when he sees dozens of faces on a grid. He is more aware of possibilities, and less willing to commit to any one of them – any one person, or place, or career. Now that he can visualize so many paths, it takes longer to search for the right one. And yet it feels obvious that, in all these possibilities, a perfect match must be somewhere out there. This is another kind of double bind, and many people get caught in it as they search for a perfect life. Meanwhile, if our tech helps bypass awkward first meetings or other social obstacles, it may remove experiences that help a person mature – learning how to get lost and find oneself again, how to navigate and negotiate, how to approach another person and hold a real conversation. We want to make a person’s time and energy more efficient, but not to reinforce his avoidance of necessary interactions, the trials and errors that strengthen his implicit learning and expose him to things he wouldn’t undertake by choice.


So this is another question for Sundae Labs to consider: how can we understand the value – and the consequences – of removing obstacles from our lives, if some of those obstacles also conceal “experience points” necessary for our own development? What is the cost of eliminating novelty and discomfort from our lives? Technology must be regarded with suspicion to the degree that it makes us less human, to the degree it allows us to bypass those confrontations that are necessary for encountering life. We don’t want to create a tool that helps us avoid reality. We want a tool that helps us confront it.


What is Possible Now?

We live in an untenable situation: humanity is serving technology rather than the other way around. With the proliferation of AI, this problem is only going to get worse. But our world of tech is here to stay, and it demands a new kind of relationship, a new vision. The mission of Sundae Labs is to flip the script on these broken promises, and use our tech to serve our humanity. If our tech is going to help resolve the problems it created, it must prioritize the following values:


  • Safety & Privacy. Use cutting-edge hardware to prevent data breaches and identity theft, safeguarding privacy above all else, and making sure your data is yours and only yours.

  • Self-Knowledge. Help you navigate the overabundance of your own data, and help you find real, relevant and revealing patterns in your bio-psycho-social metrics, creating a personal mirror that only you can see.

  • Health & Stability. Use rigorous scientific measures to help make sense of your data, providing you with insights and credible recommendations to help chart and improve your physical and mental health.

  • Meaning & Relationships. Provide tools to help you find focus and vocation in your life, and create conditions that afford deeper connections with other human beings.


Protecting privacy is paramount. Designing hardware and software for absolute safety is the uncompromisable priority of Sundae Labs, the main mission that makes everything else possible. If we design our tech this way, it will begin to free us from the double bind: hardware that is safe, that gives us more control, supports us, and makes us less dependent on itself. What if we had a technology that actively pointed away from the virtual world and back at real life? Convenience and privacy usually trade off, but it is imperative to have both. Sundae Labs will design a tool that helps you manage the excess of information around you, the excess of possibilities available at any given moment. It will help you detect real patterns in your data using sound science, help you take stock of your states of mind, and identify trends in your life without being obtrusive or presumptuous. Ultimately, we need a tool that helps us navigate the world without becoming a crutch, something that can catch our sunshine and reflect it in a way that helps improve our lives and fulfill our potential as human beings.