Lisbon, Portugal

Phone: +351 211930892

E-mail: contact@fransis.foundation

Address: Rua Campo Grande 28, 7C, 1700-093 Lisbon, Portugal


Business Tokenization approach

In the era of Big Data analysis and process automation, a new paradigm of business and economic evolution has emerged. By combining modern technologies such as social networks, AI, DLT, and FinTech, humanity has discovered a new form of socio-economic organization: the internet of value.

Internet of value


The core mechanism for implementing the internet of value is tokenization: the digitization and automation of the economy's measurement layers (finance, law, and social impact). Simply put, the goal of tokenization is to create and maintain a transparent, secure database of economic interactions, forming the base level of trust for B2B automation.

The internet brings many benefits and has enabled market giants such as Facebook, Google, Amazon, Alibaba, and Uber, so the right question is: what exactly have they discovered and monetized? The secret weapon of the IT giants is database organization: the collection, analysis, and storage of data have become strategically valuable for future use. Anybody can run a neural network to analyse data, while only a few companies focus on collecting and storing it.

However, SMEs generally lack technological training and, most importantly, lack the incentives to separate, store, and analyse their data. Most of these strategic operations are performed by third-party companies such as Uber or Alibaba, which bear no liability for how that data is analysed and used commercially. For example, taxi drivers suffer not from Uber's activity as such, but from being unable to share and use the analytics of the traffic and customer network.

In the context of Big Data, tokenization becomes a powerful instrument for digitizing economic relations and creating decentralized databases. These databases provide their owners with very cheap business intelligence, market transparency, and smart liquidity, independent of and protected from third parties.

Tokenization allows value to be operated on as data with minimal friction


Tokenization is the process of building an economic database using smart contracts: special blockchain programs that store and replicate transaction data according to stated logic. To automate the tokenization process, we use a bottom-top-bottom crypto-economy approach.
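To make "store and replicate transaction data with stated logic" concrete, here is a minimal Python sketch of a hash-chained ledger. It is a toy model, not a real blockchain contract; the class name, fields, and validation rule are illustrative assumptions.

```python
import hashlib
import json

class TokenLedger:
    """Toy append-only ledger: each record is hash-chained to the previous
    one, mimicking how a smart contract stores transaction data under a
    stated validation rule."""

    def __init__(self):
        self.chain = []

    def record(self, sender, receiver, amount):
        # Stated logic: reject non-positive transfers.
        if amount <= 0:
            raise ValueError("amount must be positive")
        prev_hash = self.chain[-1]["hash"] if self.chain else "0" * 64
        entry = {"from": sender, "to": receiver, "amount": amount,
                 "prev": prev_hash}
        # Hash the entry before the hash field itself is attached.
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        self.chain.append(entry)
        return entry["hash"]

    def verify(self):
        # Replication check: the chain is valid iff every link matches.
        prev = "0" * 64
        for e in self.chain:
            body = {k: e[k] for k in ("from", "to", "amount", "prev")}
            recomputed = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if e["prev"] != prev or e["hash"] != recomputed:
                return False
            prev = e["hash"]
        return True

ledger = TokenLedger()
ledger.record("Dorothy", "Charlie", 100)
ledger.record("Charlie", "Bob", 20)
print(ledger.verify())  # True
```

Tampering with any stored record (say, editing an amount) breaks the hash chain, so `verify()` returns `False`; this is the transparency-and-safety property the text attributes to tokenized databases.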

This approach defines the economy as a composition of economic sublayers (data sets), each described by different incentives and correlated with a different group of economic activity:

  • (C-set) The community layer represents customer segments with different payment methods. Its main incentives form a demand function, e.g. non-professional investors and short-term investments;

  • (P-set) The production layer represents the value proposition from producers. Its main incentives form an offer function, e.g. a collective investment marketplace and liquid short-term low-risk investments;

  • (T-set) The trading layer represents financial supply and is formed by liquidity providers. Its main incentive is to maintain a stable capital flow, which forms a supply function, e.g. a decentralized escrow network;

  • (O-set) The organization layer represents business activity management and is formed by strategic resources and partners. Its main incentive is to keep all layers in profitable equilibrium; it operates with a security function, e.g. a curators' committee and anchor investors.
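The four sublayers above can be encoded as a small data structure. This is a hypothetical representation; the field names and incentive strings are illustrative, not a fixed schema.

```python
# Hypothetical encoding of the four economic sublayers (data sets).
layers = {
    "C": {"name": "Community",    "forms": "demand function",
          "incentive": "non-professional, short-term investment"},
    "P": {"name": "Production",   "forms": "offer function",
          "incentive": "liquid short-term low-risk investments"},
    "T": {"name": "Trading",      "forms": "supply function",
          "incentive": "maintain a stable capital flow"},
    "O": {"name": "Organization", "forms": "security function",
          "incentive": "keep all layers in profitable equilibrium"},
}

for code, layer in layers.items():
    print(f"{code}-set: {layer['name']} layer -> {layer['forms']}")
```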

Bottom-top-bottom crypto-economy approach

Historical observation of economic evolution shows the following progression. Initially, the market is formed by demand from the community, represented by the C-set; then by the offer from producers, the P-set, a proper subset of the C-set (all producers also belong to the community, but not all of the community belongs to the producer set). After that, the market forms a financial supply, represented by the T-set as a subset of the P-set (all producers have revenue streams), to process the demand/offer equilibrium. Finally, the market creates a security organization, represented by the O-set as a subset of the T-set (the organization takes a fee from the turnover), to protect and regulate all layers. This set composition corresponds to the KPIs of an economy with a demand-offer-supply-regulation architecture.
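The subset chain described above (O ⊆ T ⊆ P ⊂ C) can be checked directly with Python sets. The agent names match the worked example below; the populations themselves are illustrative assumptions.

```python
# Illustrative agent populations for each layer; membership follows the
# subset chain O ⊆ T ⊆ P ⊂ C from the text.
community    = {"Alice", "Bob", "Charlie", "Dorothy"}   # C-set
producers    = {"Alice", "Bob", "Charlie"}              # P-set, proper subset of C
liquidity    = {"Alice", "Bob"}                         # T-set, subset of P
organization = {"Alice"}                                # O-set, subset of T

# Python set operators verify the whole chain at once.
assert organization <= liquidity <= producers < community

# Every producer belongs to the community, but a pure customer
# such as Dorothy does not produce.
assert "Dorothy" in community and "Dorothy" not in producers
print("subset chain holds")
```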

Inductive bottom-up analysis creates a theory of how the economy functions, while deductive top-down development verifies that theory against empirical data and produces a verified database:

Alice (O) is a manager: she knows Dorothy's (C) data and manages the paperwork, and she works with Bob (T), who manages accounts and money. Charlie (P) is a producer: he produces a product every time Alice (O) and Bob (T) confirm, while Dorothy (C) receives the product and pays.
We form the theory that Alice (O), Bob (T), and Charlie (P) have a company that provides Dorothy (C) with a product, where Alice's (O) incentive is to acquire more customers, Bob's (T) incentive is to sell more product, and Charlie (P) favours price over quality. This means that set O is a governing sublayer of set T (Alice cannot change the price, but only manages customers like Dorothy), set T is a subset of set P (Bob cannot receive money without Charlie's activity, but manages the price policy), and set P is a proper subset of C (Charlie is part of the community together with Dorothy, but he is not a client of the business), so we can describe this logic as offer(P)~price(P)/quality(P)~price{supply(T)/security(O)}/quality(P).
When we trace the product and the money, we verify the following: Dorothy (C) buys Charlie's (P) product and generates a bill-report, which is verified by Bob (T), analysed by Alice (O), and transformed into marketing and price policy that stimulates Dorothy (C) to buy Charlie's (P) product again. This means that set C generates the demand, covered by the offer of set P, while set T generates a fiscal signal that is transformed into a security strategy by set O. So we confirm, with some probability, that the business theory represented by the logic demand(C)~offer(P)/price(supply(T)/security(O)) is profitable.
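The verification trace above (purchase generates a bill-report, the trading layer verifies it, the organization layer analyses it) can be sketched as a small Python model. The class and method names are hypothetical; the rules for "verify" and "analyse" are simplified stand-ins for Bob's and Alice's roles.

```python
from dataclasses import dataclass, field

@dataclass
class BillReport:
    customer: str
    product: str
    amount: float
    verified: bool = False   # set by the trading layer (Bob, T-set)
    analysed: bool = False   # set by the organization layer (Alice, O-set)

@dataclass
class BusinessCycle:
    """Toy trace of the demand(C) ~ offer(P) / price(supply(T)/security(O))
    loop. Roles and rules are illustrative assumptions, not a spec."""
    reports: list = field(default_factory=list)

    def purchase(self, customer, product, amount):
        # C-set: the customer generates the demand signal (a bill-report).
        report = BillReport(customer, product, amount)
        self.reports.append(report)
        return report

    def verify(self, report):
        # T-set: the fiscal signal is verified (here: a positive amount).
        report.verified = report.amount > 0
        return report.verified

    def analyse(self, report):
        # O-set: only verified signals feed the price/marketing policy.
        report.analysed = report.verified
        return report.analysed

cycle = BusinessCycle()
r = cycle.purchase("Dorothy", "widget", 100.0)
cycle.verify(r)
cycle.analyse(r)
print(r.verified and r.analysed)  # True
```

Each completed loop leaves behind a verified, analysed record, which is exactly the empirical data the theory is tested against.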

As a result, this theory can be applied to smart-contract development and integrated into the business process as a set of KPIs, which gives the theory an empirical proof of concept and can be used for business valuation, machine-learning analysis, and forecasting.


The most common subjects of business consulting nowadays are digital intelligence and databases. These subjects have become popular and very effective because data analysis provides a business with a unique and very useful service: smart organization of its customers, accounting, investors, marketing, product delivery, and so on; in short, business intelligence.

Tokenization is a financial instrument used to create and update economic data in order to provide a business with a simple and effective process of fundraising and marketing: crowdsourcing.

Thus, from the start, a company begins to store strategic information about revenue streams, customer networks, asset distribution, financial circulation, etc., combining these indicators into a smart KPI system with transparent public and secure private access, so that it can attract investors with financial forecasts, talented employees with its level of organization, and customers with community incentive engineering.
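A KPI system with "transparent public and secure private access" could be sketched as a filtered view over one indicator store. This is a minimal illustration; the KPI names, values, and the `view` helper are all hypothetical.

```python
# Hypothetical KPI store: some indicators are public (for investors and
# customers), others are restricted to holders of private access.
PUBLIC_FIELDS = {"revenue_growth", "customer_count"}

kpi = {
    "revenue_growth": 0.12,      # public: supports financial forecasts
    "customer_count": 480,       # public: shows community traction
    "cash_on_hand": 250_000.0,   # private
    "supplier_costs": 90_000.0,  # private
}

def view(kpi, public_fields, private_access=False):
    """Return the transparent public slice of the KPI system, or the
    full record for holders of secure private access."""
    if private_access:
        return dict(kpi)
    return {k: v for k, v in kpi.items() if k in public_fields}

print(view(kpi, PUBLIC_FIELDS))        # public view only
print(view(kpi, PUBLIC_FIELDS, True))  # full private view
```

In a tokenized setting, the same split would be enforced by the smart contract's access rules rather than by an application-level filter.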