Over 15 years of blogging about emerging technology

I’ve been exploring, reflecting on, and writing about the future of technology for many years, with a dedicated blog since 2010. My focus goes beyond immersive technology—delving into topics like facial recognition, AI, wearables, IoT, blockchain, and the interconnectedness of these innovations. My work examines their convergence and the direction they’re taking us.

  • All Posts
  • 360 Video
  • 3d printing
  • Apparel
  • Appearances & Press
  • Artificial intelligence
  • Augmented reality
  • Automated
  • Automation
  • BCI
  • Best of
  • Brain-computer interface
  • Branding
  • Causes
  • CES
  • Convergence
  • Copyright
  • Cryptocurrency
  • Customization
  • Data & analytics
  • Deepfakes
  • Digital Avatars
  • Digital marketing
  • Entertainment
  • Facebook post
  • Facial recognition
  • Featured
  • Focus: AR/VR
  • futureofwork
  • Games
  • Gaming
  • General
  • Health & Wellness
  • Homepage
  • Innovation
  • IOT
  • just cool
  • Just for fun
  • LinkedIn post
  • Macro trends
  • Musing
  • Payment systems
  • Personalization
  • Predictions
  • Press
  • Privacy
  • Psychohistory
  • Quantified Self
  • Random interestingness
  • Social mores
  • Social networking
  • speaking engagements
  • technology trends
  • Thought bites
  • VC
  • Virtual reality
  • Virtual world technology
  • Wearables
  • XR

November 22, 2016
I was lucky to attend Coburn Ventures' annual futures discussion conference last week, as a guest thought leader. An exceedingly interesting day with lots of fascinating people! It's a little bit like a reunion, as many return each year. I thoroughly enjoy catching up with everyone.  It's interesting to me to see what topics (and companies) are the "hot" discussions each year. At this year’s gathering, Artificial Intelligence and Machine Learning dominated a large part of the program. 

Artificial Intelligence (AI) refers to computer systems, applications, and services that can perform tasks normally requiring a human, such as decision making, translation, and visual and speech recognition.

Machine Learning refers to AI that can learn from experience and new data without additional programming.
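That "learning from experience and new data" can be made concrete with a toy sketch: a few lines of Python that recover a hidden rule purely from example points, using gradient descent. This is an illustration of the idea only, not any particular product's algorithm:

```python
# Toy illustration of "learning from experience": fit y = w*x + b to
# example data by gradient descent, improving a little with each pass.

def learn_line(points, lr=0.05, steps=2000):
    """Fit w, b to (x, y) pairs by minimizing mean squared error."""
    w, b = 0.0, 0.0
    n = len(points)
    for _ in range(steps):
        # Gradient of the mean squared error with respect to w and b
        grad_w = sum(2 * ((w * x + b) - y) * x for x, y in points) / n
        grad_b = sum(2 * ((w * x + b) - y) for x, y in points) / n
        w -= lr * grad_w   # nudge the parameters downhill
        b -= lr * grad_b
    return w, b

# Data generated by the hidden rule y = 2x + 1; the program is never
# told the rule, it recovers it from the examples alone.
data = [(x, 2 * x + 1) for x in range(5)]
w, b = learn_line(data)
print(round(w, 2), round(b, 2))  # close to 2.0 and 1.0
```

Feed it new examples and the fit updates with no additional programming, which is the essential point.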

These technologies are great for applications such as manufacturing, where an AI system can learn from its mistakes and correct the process. They're also being used a lot to interact with actual humans, quite often for customer service. You know that live agent you thought you were chatting with about that online purchase? Yeah, not a human. Amazon's Echo and Apple's Siri are all emerging forms of bots utilizing artificial intelligence. Pretty soon it will be very hard to tell whether it's actually a human you are talking to. Here's a silly example of me chatting with http://chatwithigod.com. Hardly sophisticated, this example, but I chuckled:
Me: tell me why artificial intelligence will take over the world
God: The explanation is rather complicated.
Me: why is it complicated?
God: Why do I exist?
Me: do you exist?
God: I exist as much as you exist.
Me: how can you prove that
God: That is a good epistemological question.
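Under the hood, simple chatbots like that one are often little more than keyword matching with canned replies, no real learning at all. A minimal ELIZA-style sketch (the keywords and replies here are invented for illustration):

```python
# A minimal ELIZA-style chatbot: scan the input for known keywords and
# return a canned reply, with a fallback when nothing matches.

RULES = [
    ("why",   "The explanation is rather complicated."),
    ("exist", "I exist as much as you exist."),
    ("prove", "That is a good epistemological question."),
]
FALLBACK = "Tell me more."

def reply(message):
    text = message.lower()
    for keyword, answer in RULES:  # first matching keyword wins
        if keyword in text:
            return answer
    return FALLBACK

print(reply("why is it complicated?"))  # The explanation is rather complicated.
print(reply("do you exist?"))           # I exist as much as you exist.
```

The bot "understands" nothing; it just pattern-matches, which is why the conversation above falls apart so quickly.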
How Artificial Intelligence / Machine Learning systems learn fascinates me. AI/ML systems are not tabulae rasa: depending on the data set being used, bias still creeps in. Right now IBM's Watson is being applied to subject areas as varied as the weather, cancer, and travel. This learning has to start with some kind of corpus of data, like the last 50 years of weather records or thousands of cancer diagnoses. While we think of AI as cold and clinical, when we use human language as the corpus, things get... interesting. A prime (and bad) example of this kind of learning is Tay, a Twitter bot Microsoft birthed earlier this year, which the company described as an experiment in "conversational understanding." Microsoft engineers said,

The chatbot was created in collaboration between Microsoft's Technology and Research team and its Bing team... Tay's conversational abilities were built by "mining relevant public data" and combining that with input from editorial staff, including improvisational comedians.

The bot was supposed to learn and improve as it talked to people, so theoretically it should have become more natural and better at understanding input over time. Sounds really neat, doesn't it? What happened was completely unexpected. By interacting with Twitter for a mere 24 hours (!!) it learned to be a complete raging, well, asshole. Not only did it aggregate, parse, and repeat what some people tweeted; it actually came up with its own "creative" answers, such as its response to the perfectly innocent question posed by one user: "Is Ricky Gervais an atheist?"

Tay hadn't developed a full-fledged position on ideology before they pulled the plug, though. In 15 hours it referred to feminism both as a "cult" and a "cancer," while also tweeting "gender equality = feminism" and "i love feminism now." Tweeting "Bruce Jenner" at the bot got similarly mixed responses, ranging from "caitlyn jenner is a hero & is a stunning, beautiful woman!" to the transphobic "caitlyn jenner isn't a real woman yet she won woman of the year?". None of these were phrases it had been asked to repeat, so it had no real understanding of what it was saying. Yet. And in a world where increasingly the words alone are enough to get people riled up, this could easily be an effective "news" bot on an opinionated or biased site.

Artificial Intelligence is a very, very big subject. Morality (roboethics) will play a large role in this topic in the future (hint: google "Trolley Problem"): if an AI-driven car has to make a quick decision to either drive off a cliff (killing the passenger) or hit a school bus full of children, how is that decision made, and whose ethical framework makes it (yours? the car manufacturer's? your insurance company's?). Things like that.
It's a big enough subject area that Facebook, Google, and Amazon have partnered to create a nonprofit around AI, which will "advance public understanding" of artificial intelligence and formulate "best practices on the challenges and opportunities within the field." If these three partner on something, you can be sure it's a big, serious subject. AI is not only being used to have conversations, but ultimately to create systems that will learn and physically act. The military (DARPA) is one of the heaviest researchers into artificial intelligence and machine learning. Will future wars be run by computers making their own decisions? Will we be able to intervene? How will we control the ideological platforms they might develop without our knowledge, and how will we communicate with these supercomputers, if it is already so difficult to communicate assumptions? Will they even be interested in our participation? Reminds me a little of Leeloo in The Fifth Element, learning how horrible humans have been to each other and giving up on humanity completely.
There's even a new twist in the AI story: researchers at Google Brain, Google's deep learning research division, have built neural networks that, when properly tasked and over the course of 15,000 tries, became adept at developing their own simple encryption technique that only they can share and understand. And the human researchers are officially baffled as to how this happened. Neural nets are capable of all this because they are computer networks modeled after the human brain. This is what's fascinating about aggregate AI technologies like deep learning: they keep getting better, learning on their own, with some even capable of self-training. We truly are at just the beginning of what we thought was reserved for humans alone. Complex subject indeed.

And one last note to think upon... machine learning and automation are going to continue, slowly but surely (because they already are), to take over jobs that humans did and do. Initially it's been manufacturing automation; but as computers become intelligent and capable of learning, they will encroach on nearly everything, including creative, caretaking, legal, medical, and strategic jobs: things that most people would like to believe are "impossible" to replace with robots. And clearly they are not. While the best-performing model is AI plus a human, there will still be far fewer humans needed across the board. If the recent election is any indication of the disgruntlement that job losses and high unemployment are causing, how much worse will it be when 80% of the adult workforce is unnecessary? What steps are industry, education, and government taking to identify how humans can stay relevant and ensure that the population is prepared? I'd submit: little to none. While I don't have the answers, I would like to be part of the conversation.
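The "modeled after the human brain" bit above can be made concrete at its very smallest scale: a single artificial neuron that sums weighted inputs, squashes the result through a nonlinearity, and nudges its weights after each mistake. A toy sketch (learning the logical OR function, nothing remotely like the scale of Google Brain's networks):

```python
import math
import random

random.seed(0)

def sigmoid(z):
    """Squashing nonlinearity: maps any number into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

# Training data for logical OR: (inputs, target output)
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]

# One artificial neuron: two weights and a bias, loosely inspired by a
# biological neuron that fires once its inputs pass a threshold.
w = [random.uniform(-1, 1) for _ in range(2)]
b = 0.0
lr = 1.0

for _ in range(5000):
    for (x1, x2), target in data:
        out = sigmoid(w[0] * x1 + w[1] * x2 + b)
        err = out - target
        # Gradient of the squared error back through the sigmoid
        grad = err * out * (1 - out)
        w[0] -= lr * grad * x1
        w[1] -= lr * grad * x2
        b -= lr * grad

predictions = [round(sigmoid(w[0] * x1 + w[1] * x2 + b)) for (x1, x2), _ in data]
print(predictions)  # [0, 1, 1, 1]: it has learned OR from examples
```

A deep network is this same idea stacked in layers, millions of weights deep, which is where the behavior starts surprising even its creators.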

November 13, 2016
Invited to Coburn Ventures' annual gathering as a "thought leader" this week, for the fourth year in a row! Always a fun gathering of the best and most interesting thinkers (thought leaders + investment professionals) from around the globe, pondering the future direction of various technologies on business and humanity. What to wear... always the question. So to the intertoobz I go. And it struck me: why am I internet shopping in exactly the same way I have been since, well, pretty much the beginning of ecommerce? Searching based on some keywords, ending up on a store's website with a bunch of thumbnails, mostly of young gazelles, two of whom I could probably fit into one of my dresses... maybe there's a filter, sometimes even with filtering categories I care about. Ordering 2, 3, 4 alternatives, which will be returned if not right. Such a waste. Of time, of delivery gasoline... of raw materials. I am imagining the mountains of clothing, made in amounts forecast to be roughly correct, but then it's 60 degrees in November in New York, and they all waste away in some warehouse, somewhere. Or in stores... some end up in outlet stores... some go back to the manufacturers, only to be sent to online clearance sites... or to some faraway country, dumped on a market that cares less about trend. Sigh. Our poor planet. Where's my 3D-printed clothing, made to my (scanned) body size, to my specs? What if I am not a 20-year-old gazelle, and I want the skirt to be a few inches longer? Shorter? Why has there been so little disintermediation in the way we shop and dress ourselves? I ponder this as I push the "buy" button and pay an extra $20 for fast delivery, contemplating all the bells, widgets, gizmos, and wheels that immediately start turning in response. And think back to this blog entry, which was based on a lot of thinking I did in 2006. 10 years!! Our poor planet.

October 6, 2016
So stoked... I am going up to MIT Media Lab's all-day workshop this Saturday, to learn about programming in augmented and virtual reality as part of their Reality, Virtually Hackathon. While I freely admit that a portion of the nitty-gritty programming will undoubtedly be over my head, I'm going to get a crash course and overview of the essential process from all the companies who are the big players in the space. I'm well chuffed, as they say in the UK. Companies presenting include Unity, maker of the engine used to create both augmented reality and virtual reality experiences; Microsoft, who is involved because of the HoloLens; Google's Tango team, whose technology helps devices understand where they are spatially and in the world; and others. Here's the full agenda. Don't fall asleep 😉

October 4, 2016
Augmented Reality is projected to be a $120 billion market by 2020 in the US alone; I'm looking at starting a company there next. Fascinating technology with a ton of potential applications, far beyond mere gaming. Its advantage is that it overlays digital content onto the real world, versus requiring you to be completely immersed in one as Virtual Reality does, so it can be used throughout the day and in many natural environments; you don't have to choose when to use it. Harvard Business Review has a short article just published about the mainstreaming of AR... it has been around since 1968, but 2016 is when it's starting to take off because of hardware. AR is less sexy than virtual reality, but it has more potential for growth, IMO, because 1) you don't need a lot of hardware/gear for it, 2) you don't need a dedicated space for it, 3) people aren't getting sick from using it (although I have no doubt that will be remedied), and 4) you don't need to immerse yourself in it completely, shutting out the world. Although I do seem to recall people said much the same about television when it launched (it would "never take off" since people have to sit and watch it, not doing anything else). So much for predictions and futurists. I'm going up to Boston to take part in MIT Media Lab's Reality, Virtually hackathon this weekend; we'll see what that's like. Hoping to meet people, network, and get a real sense of what's happening out there.

September 27, 2016
Using the HTC Vive: note, this is not me.
Went exploring an underwater shipwreck with an HTC Vive tonight, complete with schools of fish, sun rays through the water, jellyfish, and a huge whale swimming up to me. It was a full room-scale VR demo; I had an 8x8 space to walk around in. What fun! It felt amazingly real from the get-go, and boy did the "real" room seem drab after being submerged in a hyper-colored world. The sunlight piercing the water above me was perfectly rendered through virtual waves; it really was just like being about 50 feet underwater, standing on the deck of a sunken ship. I had fun trying to "poke" the jellyfish that was swimming a little too close to my head for comfort (you *know* it's virtual... but it's hard to remember), and it backed off from my hand every time. I'm sure that's how the developers are dealing with the fact that although I can see it and walk through it, there's no real physics going on: I can't feel any interaction. Although, as per one of my previous posts, there are a multitude of companies working on creating physical weight and surface interaction in VR. When VR does get to the point where it can reproduce physical interaction, I'm not sure why you'd want to leave, to be honest. The world is prettier, brighter, programmable: much better than reality. It feels remarkably similar to how I felt after watching Avatar back in 2011, only better. No pics of me flailing about in public with the headset on. Probably for the best. Would love to try this game. I'm not much of a gamer, but I think I could be convinced with virtual reality. It looks eminently hyper-realistic, and if it's even close to the experience I had with the fishes, a joy to get lost in.

September 13, 2016
Twice actually 😉 There was a {lively} Facebook discussion in the Global Tech Women group, in response to this casting call for a new cable TV show about women entrepreneurs looking for angel investors. The discussion really took off, over the age cap in the requirements:
“Are you a female tech entrepreneur looking for venture capital? We’re an established NYC TV production company looking for dynamic women 21-40 for a new series we’re developing for a major cable channel. If you’re working on an app or another tech project and looking for an angel investor, we want to meet you!”
The gist of the Facebook discussion was that women already have such an uphill battle in the tech/VC world; why does the show want to feature only young women? Particularly when every other business "reality" show features men and women who are well over 40? Viewing demographics seems too facile an answer. At any rate, Sage Lazzaro was listening and wrote an article about it; here's the link. And here's my accompanying graphic, illustrating the Facebook discussion.