
Tokenize Every Little Thing (ELT)

[Since first writing this article in January 2018, I’ve concluded that Ethereum is not capable of being a platform for Tokenizing Every Little Thing. Ethereum is a one-trick pony x 1500 when it comes to creating large-scale decentralized applications (i.e., Ethereum/Solidity smart contracts are best for creating single, simple entities like alt-coins). Check out slide 56 of this presentation: NEO Blockchain Vancouver 20180315 Meetup. The NEO Blockchain and NEO Smart Economy is the best available 3rd generation distributed application platform on the planet and improving every day. Michael Herman, March 17, 2018]

[Also check out the webcast The NEO Smart Economy, Smart Processes, and Smart Data. Michael Herman, April 9, 2018]



Just over one year ago, I introduced the concept of graphitization and talked about #Graphitization of the Enterprise. I opened the article with the challenge:

Move beyond digitalization of the enterprise to graphitization of the enterprise.

For 2018 and beyond, the challenge is simpler to state but more difficult to achieve:

Tokenize Every Little Thing (ELT)

To provide more context, let me first quote from the introductory paragraphs of the #Graphitization article.

Here’s a great diagram that explains this concept [graphitization].

[Figure 1. The New Model of IT]

Graphitization of not only all of your corporate information assets across all of your constituencies and stakeholders – at the data, application entity, and business object level – but also the graphitization of all of the interconnections between every business process, application system, infrastructure component, cloud service, vendor/service provider, and business role that uses, manages, or stores corporate information (Crossing the EA Chasm: Automating Enterprise Architecture Modeling #2).

Use graphitization to make your existing corporate information more available, more usable, and more informative. Graphitization enables you to “Keep Calm and Have IT Your Way”.

What is #Graphitization?

#Graphitization is a data science and enterprise architecture-inspired framework and process model for modeling, ingesting, organizing, analyzing, and visualizing any domain of endeavor by using graphs – networks of connected objects and relationships with each object and relationship annotated with additional descriptive information (metadata).
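To make the idea concrete, here is a minimal sketch (my illustration, not from the original article) of such a metadata-annotated graph in Python, assuming the networkx library; the asset names are hypothetical:

    import networkx as nx

    # A tiny property graph: objects and relationships both carry metadata.
    g = nx.MultiDiGraph()

    # Objects (nodes): corporate information assets, annotated with metadata.
    g.add_node("CustomerDB", kind="database", owner="Sales IT")
    g.add_node("BillingApp", kind="application", vendor="internal")
    g.add_node("Invoice", kind="business object")

    # Relationships (edges): interconnections, also annotated with metadata.
    g.add_edge("BillingApp", "CustomerDB", rel="reads from", protocol="SQL")
    g.add_edge("BillingApp", "Invoice", rel="produces", format="PDF")

    # Analyze/visualize: list every relationship with its metadata.
    for src, dst, attrs in g.edges(data=True):
        print(f"{src} -[{attrs['rel']}]-> {dst}  {attrs}")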

Why #Tokenization?

Given the burgeoning preoccupation of the world’s business, finance, government, and technology sectors with blockchain technologies, cryptocurrencies, and token-this and token-that, the buzzword for 2018 will be #Tokenization… the creation of tokens (multiple versions of tokens) to represent every thing on the planet… Every Little Thing (ELT).

While individuals, startups, and larger organizations are trying to dream up the next big, one-off token or cryptocurrency, why not just admit that, “in the end”, everything will be represented by a token?

Why try to knock these off one at a time (e.g. Bitcoins, Ethers, altcoins, CryptoKitties, letters of credit, auctions, escrow agreements, electronic health records (EHR), electronic medical records (EMR), etc.) when the ultimate goal is to create a universal interconnected graph of ELT (Every Little Thing) in the universe?

Why #graphitize the enterprise when you can #tokenize the universe?

What is #Tokenization?

Let’s get a little computer-sciency for just a minute. A common task is to take an input stream (a string of characters, a stream of data, a data file, or a database table), analyze it, and convert it into a collection or sequence of higher-level tokens for further analysis (a process that can be applied recursively). Here’s an explanation from Wikipedia:

In computer science, lexical analysis, lexing or tokenization is the process of converting a sequence of characters (such as in a computer program or web page) into a sequence of tokens (strings with an assigned and thus identified meaning). A program that performs lexical analysis may be termed a lexer, tokenizer, or scanner… [https://en.wikipedia.org/wiki/Lexical_analysis]

…and later in the same Wikipedia article…

Tokenization

Tokenization is the process of demarcating and possibly classifying sections of a string of input characters. The resulting tokens are then passed on to some other form of processing. [https://en.wikipedia.org/wiki/Lexical_analysis#Tokenization]
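Here is a minimal, hedged sketch of this kind of tokenization in Python (a regex-based lexer; the token names and tiny grammar are my illustrative assumptions, not part of the Wikipedia definition):

    import re

    # Token specification: (token name, regex pattern), in priority order.
    TOKEN_SPEC = [
        ("NUMBER", r"\d+"),
        ("IDENT",  r"[A-Za-z_]\w*"),
        ("OP",     r"[+\-*/=]"),
        ("SKIP",   r"\s+"),
    ]
    MASTER = re.compile("|".join(f"(?P<{name}>{pattern})" for name, pattern in TOKEN_SPEC))

    def tokenize(text):
        """Demarcate and classify sections of the input string into tokens."""
        for match in MASTER.finditer(text):
            if match.lastgroup != "SKIP":  # drop whitespace between tokens
                yield (match.lastgroup, match.group())

    print(list(tokenize("price = base + 42")))
    # [('IDENT', 'price'), ('OP', '='), ('IDENT', 'base'), ('OP', '+'), ('NUMBER', '42')]

Each (name, value) pair is a token with “an assigned and thus identified meaning”, ready to be passed on to some other form of processing.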

Coming up for air… Why not represent every little thing that happens in the universe as a stream of blockchain transactions? For example (see the sketch after this list):

  • the events in your life?
  • everything that occurs during a Presidential election?
  • the 24-hour cycle of one day changing into the next?
  • the activity-by-activity and task-by-task execution of a business process?
  • a stream of events from your Internet-of-Things (IoT) enabled car, toaster or refrigerator?
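As a minimal sketch (illustrative only; a real Ethereum-based implementation would differ), such an event stream can be modeled in Python as a hash-linked log of transactions, one transaction per real-world event:

    import hashlib, json, time

    def make_transaction(prev_hash, event):
        """Wrap one real-world event as a blockchain-style transaction."""
        body = {"prev": prev_hash, "time": time.time(), "event": event}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        return {"hash": digest, **body}

    # A toaster's hypothetical IoT event stream, appended as a hash-linked log.
    chain, prev = [], "0" * 64  # all-zero genesis predecessor
    for event in ["power on", "toast inserted", "cycle complete"]:
        tx = make_transaction(prev, event)
        chain.append(tx)
        prev = tx["hash"]

    for tx in chain:
        print(tx["hash"][:12], tx["event"])

Because each transaction embeds the hash of its predecessor, the log is tamper-evident: altering any past event breaks every hash that follows it.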

Jim Gray and TerraServer

In one of his several presentations on Scalable Computing (circa 1999), Jim Gray (relational database pioneer and Turing Award winner) describes the TerraServer project in the following way:

[Users navigate] an ‘almost seamless’ image of earth.

SkyServer was a similar project quarterbacked by Gray:

TerraServer allowed access to newly-available satellite imagery with resolution of 1.5 meters/pixel. SkyServer, a collaboration with Alexander Szalay and his colleagues at Johns Hopkins, allowed access to astronomical data from the Sloan Digital Sky Survey. SkyServer led to additional work with astronomical data, … [https://amturing.acm.org/award_winners/gray_3649936.cfm]

Tokenize Every Little Thing

With the advent of blockchain technologies (in particular, the Ethereum extensible blockchain platform), why can’t we embark on a grander mission: tokenize Every Little Thing, including all token-pair relationships (TPRs)?
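As a rough sketch of what a token plus its token-pair relationships might look like (a hypothetical data model of mine; TPR is the article’s term, the field names are not):

    from dataclasses import dataclass, field

    @dataclass
    class Token:
        """A token representing one Little Thing, with descriptive metadata."""
        symbol: str
        represents: str
        metadata: dict = field(default_factory=dict)

    @dataclass
    class TokenPairRelationship:
        """A TPR: a typed link between two tokens."""
        a: Token
        b: Token
        rel: str

    car = Token("CAR1", "an IoT-enabled car")
    owner = Token("MH1", "a personal identity token")
    tprs = [TokenPairRelationship(owner, car, "owns")]

    for t in tprs:
        print(f"{t.a.symbol} -[{t.rel}]-> {t.b.symbol}")

Tokens play the role of graph nodes and TPRs the role of edges, which is what makes the “universal interconnected graph of ELT” a graph in the #Graphitization sense.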

What will it take?

What needs to change in the Ethereum blockchain platform? Will Ethereum be able to scale to support the modeling, ingesting, organizing, analyzing, and visualizing of Every Little Thing (ELT)?

On your mark, get set, …

Best regards,
Michael Herman (Toronto)

mwherman@parallelspace.net

Other Important References

  • Gordon Bell, MyLifeBits MSR Project (early 2000s). I remember Jim Gray telling this story, but I had trouble finding a proper reference because I thought it was Gray’s story instead of Bell’s. I now know better, but I’ve already finished the above article. A Wikipedia MyLifeBits reference can be found here. YouTube videos can be found here, here, and others over here. Channel 9 videos: Part 1 and Part 2. Computerworld article (2008). Business Insider article (2016).
  • Gordon Bell’s MSR web page.
