
1 A View on the Current Situation Beyond the Older Economics

We are facing a harsh reality. First, superstars like Vanguard and BlackRock (extraordinary asset management corporations) now dominate much of the world. These corporations directly or indirectly control almost 80% of international capital worldwide. In this century, we cannot overlook corporations that operate above governments. More interestingly, these financial institutions are not regulated by the banking acts. The traditional macro/financial model, whether orthodox or heterodox, no longer holds. Second, the method of reasoning is also changing. Previously, we believed that reasoning had to proceed from induction to deduction by way of an approximation function. We did not believe in an alternative scientific method. However, it may be the new normal to believe in transduction, as represented by machine learning. In this context, we will also be forced to change our style of communication, whether of money or of information.

1.1 The Harsh Reality and the Dominance of the Fast-Track Path to the Future

Given such a reality, in which the free market mechanism has broken down, evolutionary economics needs some game-changing ideas. Even Nobel laureates no longer provide us with insights into the future. Creative destruction, the famous term of Schumpeter, the pioneer of evolutionary economics, is literally under way in the world. Sometimes this word is replaced with the Great Reset. However, we still have no idea what the coming system after the destruction will be. We have not established any measure of the coming society.

The new coronavirus pandemic has changed lifestyles worldwide, and they are unlikely ever to return to their original form. This great transformation will change the nature of the socio-economic system itself and will be centered on digital designs. This direction may suggest the agendas that the United Nations (UN) and the World Economic Forum (WEF) proposed around the Sustainable Development Goals (SDGs).Footnote 1 In fact, such a fast track to progress will be a dominant factor in determining the future path. However, human history is not necessarily uniquely established in advance. In this book, we thus avoid identifying the future track with the established agendas.

Econocentrism Containing Monetary Exchange

At present, money is beginning to undergo a major revolution. Many books dealing with digital designs and innovations have been published, but few if any of them focus on monetary and analytical methods in the way that the present volume does. Taking into account the new advancement of monetary exchange, our book may be called econocentric.Footnote 2

1.2 The Limits of Older Economics

Dealing with the new attributes brought about by this great change will be beyond the scope of traditional economics. Digital tools such as blockchain, cryptocurrency, and crypto assets, as well as distributed ledger systems, require new modes of analysis. First, the evolution of money, and the complex thinking necessary for understanding that change, must be analyzed. Furthermore, the way that goods markets are mutually coordinated and the future of the labor market must be understood, points that are emphasized in the first part of the book. Second, in the latter part, other computational approaches to social dilemmas, cryptographics, and the supply chain are introduced. To facilitate understanding of the core engine of market capitalism, the detailed settlement mechanism is presented in terms of an AI market experiment.

To date, traditional economics has chosen to argue about the “market in general”, as symbolized by general equilibrium theory, which was originally established in the 1870s. In reality, unfortunately, the “generality” that Léon Walras first expected no longer holds by any means. To depict the modern market, it may be not only inappropriate but also overly far-reaching for us to discuss the market in general.

It is difficult for the classical scope of economic reasoning to analyze the contemporary system that we have in mind. However, we temporarily illustrate how classical economics approaches the modern macroeconomic system, following Schefold (2021). Schefold assessed the Cambridge/Keynesian economics inherited from classical economics in the following manner:

Piketty shows that larger fortunes tend to be associated with higher rates of return on financial investments, and in this the modern reality is different (Piketty, 2014), but we cannot analyse here how the rich get richer. ... I only want to stress that the Cambridge economists could not easily leave the narrow framework of steady state analysis, because it was associated with the constant capital-output ratio, but they had no theory for its determination. Schefold (2021, 13)

Here the steady-state analysis is connected with the assumption of a constant capital-output ratio. Joan Robinson, Keynes’s best disciple, held that few reswitchings in the choice of techniques occur except around corner points, provided that a constant capital-output ratio holds almost everywhere. The dynamics of macroeconomics will thus be concentrated on the field of investment, in particular, the behavior of investors.Footnote 3

[O]nce a production function is given, Keynesian analysis is inherently more flexible, not in the sense of agnosticism, but by emphasizing forces that matter in the real world, principally the behaviour of investors. ... The investment climate can be assessed. Neo-Fisherians believe that moderate rises of interest rates and prices might go together, involving rational expectations. With a constant capital-output ratio, the effect of a small and slow rise of the interest rate similarly seems precarious, but possible for a Keynesian, in what Joan Robinson called a state of tranquillity that is not disturbed by a process of substitution (Schefold, 2021).


A Note on Quantum Financial System (QFS)

Finally, we need to take note of the advent of the quantum financial system (QFS). As the IMF noted in Fall 2021, “Quantum computers could crack the cryptography that underpins financial stability”.Footnote 4 The QFS could also be an alternative to Bitcoin and its kin. Quantum bits may be superior to classical 0/1 bits. However, the current crypto money will still retain the power of decentralization and community consensus.

How the Contemporary System Differs from the Image of Classical Economics

We are instead interested in how the contemporary system differs substantially from the image of classical economics. Fortunately, we have recently acquired two major technological advancements: AI in general, and Bitcoin, or cryptocurrency, in particular. The latter is a byproduct of blockchain technology. Yet the impact of either Bitcoin or Ethereum is enormous in theory and in practice. In particular, Ethereum is the more interesting in the sense that it also provides smart contracts. This has the potential power to innovate the economic system and, quite possibly, the social system.Footnote 5

1.3 Some Instances of Using Blockchain

It is noted that blockchain is not simply a technology for cryptocurrencies. Blockchain is also used as a sensing and traceability technology, e.g., in order to retrieve a defective lot in a factory system in the manufacturing industry.

Currency Exchange

Among these, we note the IBM blockchain applied to currency exchange, which provides financial institutions with a better opportunity by way of a world wire network (API): the two currencies to be exchanged are each transformed into “a stable coin” as a digital asset, which can achieve a simultaneous exchange between the two. Here messaging, clearing, and settlement for the desired exchange are integrated by blockchain technology without falsified bookkeeping. This usage may be classified as a hybrid application of blockchain in the sense that blockchain is incorporated as a sub-system of the currency exchange. IBM Blockchain is an instance of delivering value around the world. See https://www.ibm.com/blockchain

A Simple Auction Design on the Ethereum Platform

In 1993, Nick Szabo described how users could input data or value and then regain it from the transfer system, as if it were a digital vending machine. Ethereum shares the same function as Bitcoin in the sense that the network can transfer value from one party to another. Unlike Bitcoin, which only transfers the value of currency, Ethereum provides each node with an additional information account, which can function as a distributed ledger for each node (Fig. 7.1).

Fig. 7.1 Smart contracts on the Ethereum platform: a user calls a smart contract, each miner executes the contract code, and only the calling user receives the result.

*Cited from https://vas3k.com/blog/ethereum/ but the chart was originally produced

Turing Completeness and Smart Contract

Ethereum is called a system written in a Turing-complete language. Since a Turing-complete system is one in which a program can be written that will find an answer, Ethereum, if it is indeed Turing complete, allows autonomous agents to be programmed. Such a contract is called a smart contract.Footnote 6

The features of smart contracts are summed up as follows.

  • Smart contracts are fulfilled with multisignature accounts, which imply that “funds are spent only when a required percentage of people agree” (a minimal sketch of this rule follows this list).

  • Smart contracts can manage agreements between users, say, if one buys insurance from the other.

  • Smart contracts provide utility to other contracts (similar to how a software library works).

  • Smart contracts can store information about an application, such as domain registration information or membership records.
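To make the first feature concrete, the following is a minimal Python sketch of the multisignature rule. It is an illustrative model only, not Ethereum or Solidity code; the class and method names (MultisigWallet, approve, execute) are hypothetical.

```python
# A toy model of the multisignature rule described above:
# funds are spent only when a required percentage of owners agree.
# Illustrative only; not Ethereum/Solidity code.

class MultisigWallet:
    def __init__(self, owners, required_ratio=0.6):
        self.owners = set(owners)        # accounts allowed to vote
        self.required_ratio = required_ratio
        self.approvals = set()           # owners who approved the pending payment

    def approve(self, owner):
        if owner not in self.owners:
            raise ValueError(f"{owner} is not an owner")
        self.approvals.add(owner)

    def execute(self, payee, amount):
        # The payment goes through only when the agreed percentage is reached.
        ratio = len(self.approvals) / len(self.owners)
        if ratio >= self.required_ratio:
            print(f"Transfer {amount} to {payee} (approved by {ratio:.0%})")
            self.approvals.clear()
            return True
        print(f"Rejected: only {ratio:.0%} of owners agreed")
        return False

wallet = MultisigWallet(["alice", "bob", "carol"], required_ratio=0.6)
wallet.approve("alice")
wallet.execute("dave", 10)   # rejected: 33% < 60%
wallet.approve("bob")
wallet.execute("dave", 10)   # executed: 67% >= 60%
```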

2 Some Fundamental Changes of the Socio-Economic System

The economics of production of the last century depended on capital and labor. In the 19th century, the factors of production were land and labor. Land was replaced by capital as capitalistic production developed. This suggests that the style of production is not fixed forever. To our understanding, the style is transitioning again, now that a universal tool-making machine has actually appeared. Siemens has already provided this kind of entity.Footnote 7

Almost every input will be invisible except program code. In this entity, substitution between factor inputs will not matter in terms of cost comparison. In other words, this entity will no longer guarantee the idea of a continuous substitution between physical factor inputs. Thus, this kind of entity will not only drastically reduce production time and the related cost of labor input but will also invalidate the traditional idea of a choice of techniques. The economic distributive principle will then no longer be guaranteed by so-called production function analysis; thus, we must prepare a new analysis of production, although we face too many corrective tasks to establish it along traditional lines. In this chapter, we instead refer to some fundamental problems of traditional production analysis.

2.1 The Short-Run Production Function and Its Aggregation Form

Hildenbrand’s production function study was published in Econometrica in 1981 (Hildenbrand, 1981) but did not see the light of day until Dosi et al. (2016) drew attention to it. As is well known, the law of returns is assumed a priori in the traditional production function. However, Hildenbrand (1981) drew a short-run production function using the example of the Norwegian tanker industry in 1967 (377 vessels with a load capacity of over 15,000 tons). The tankers were of various types, 57 turbine-driven and 320 motor-driven, and their production dates ranged from 1950 to 1966. The output of the tanker industry is tonnes per day times miles transported. Following tradition, only two inputs are considered: fuel and labour. Referring to the work of Johansen and Eide, the production function for the Norwegian tanker industry is shown in the upper panel of Fig. 3 of Hildenbrand (1981, 1100). As shown in his figure, the techniques used in the industry represent combinatorial compositions. A simplistic understanding of firms’ technology thus contradicts reality: the industrial technology is based on a wide range of unequal productivities reflecting a complex technology. Hence the real isoquant of the industry does not compose a typical isoquant on the material (capital)-labor plane.

According to Hildenbrand (1981), this idea differs from the traditional production function. We define the projection of Y on the input space \(\Re _{+}^{l}\):

$$\begin{aligned} D= \{V \in \Re _{+}^{l} \mid (V, X) \in Y \, \text {for some} \, X \in \Re _{+} \} \end{aligned}$$

The traditional production function then reads:

$$\begin{aligned} F(V) = \max \{ X \in \Re _{+} | (V, X) \in Y \} \end{aligned}$$

The operator \(\max \) above excludes the possibility of “certain institutional barriers to factor mobility in aggregating the individual production sets” (Hildenbrand 1981, 1097).
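To make the aggregation concrete, here is a minimal numerical sketch, with invented activity data, of the short-run function \(F(V)\): each ex post activity may be run at a level \(\varphi _n \in [0,1]\), and aggregate output is maximized subject to the aggregate input bound V. This anticipates the zonotope construction below.

```python
# A sketch of Hildenbrand's short-run production function
# F(V) = max{ X | (V, X) in Y }: activity levels phi_n in [0, 1] are
# chosen to maximize total output under the aggregate input constraint.
# The activity data are hypothetical, not Hildenbrand's tanker data.
import numpy as np
from scipy.optimize import linprog

# Each row: (fuel, labour, output) of one production unit (e.g., a tanker).
A = np.array([
    [2.0, 1.0, 5.0],
    [1.0, 2.0, 4.0],
    [3.0, 1.5, 6.0],
    [0.5, 0.5, 1.5],
])
inputs, outputs = A[:, :2], A[:, 2]

def F(V):
    """Maximum aggregate output producible with input vector V."""
    # linprog minimizes, so negate the output coefficients.
    res = linprog(c=-outputs,
                  A_ub=inputs.T, b_ub=V,            # total inputs must not exceed V
                  bounds=[(0, 1)] * len(outputs))   # 0 <= phi_n <= 1
    return -res.fun

print(F(np.array([4.0, 3.0])))    # output attainable with 4 fuel, 3 labour
print(F(np.array([10.0, 10.0])))  # enough inputs to run every unit fully
```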

In general, the ex post technology of a production unit is a production activity a, i.e., a vector that produces, during the current period, \(a_{l+1}\) units of output by means of \((a_1,\ldots ,a_l)\) units of inputs. The size of the firm is the length of the vector a, i.e., a multi-dimensional extension of the usual measure of firm size (Fig. 7.2).

Fig. 7.2 Hildenbrand’s short-run production set and its three-dimensional zonotope: activity vectors \(a_1,\ldots ,a_4\) and their successive Minkowski sums. *The author depicted this figure by Mathematica

2.2 An Alternative, Zonotope-Based Production Set

Given a set of generators \(\{a_1,\ldots , a_N\}\), \(a_n \in \Re _{+}^{l+1}\), the zonotope Y is the Minkowski sum of the segments \([0, a_n]\); that is, Y is the set of all vectors of the form \(\sum _{n=1}^{N} \varphi _n a_n\) with \(0 \le \varphi _n \le 1\). Equivalently, Y is the shadow of the N-dimensional cube \([0, 1]^{N}\) under the projection into \(\Re ^{l+1}\). Its volume is

$$\begin{aligned} Vol (Y) = \sum _{1\le i_1< i_2< \cdots< i_{l+1} \le N} |\Delta _{i_1, \ldots , i_{l+1}}| \end{aligned}$$

Here \(|\Delta _{i_1, \ldots ,i_{l+1}}|\) is the modulus of the determinant of the \((l+1) \times (l+1)\) matrix formed by the generators \(a_{i_1}, \ldots , a_{i_{l+1}}\). This kind of discussion suggests a new growth/innovation theory of production. It is noted that Vol(Y) measures the volume of a rugby-ball-shaped solid. This idea of volume is defined by Dosi et al. (2016). A normalized measure is the Gini volume of the zonotope, which can be regarded as a generalization of the well-known Gini index. Let \(Vol(P_Y)\) be the volume of the parallelotope \(P_Y\) of diagonal \(d_Y = \sum _{n=1}^N a_n\), that is, the maximal volume we can get when the industry production activity \(\sum _{n=1}^N a_n\) is fixed. The Gini volume then reads:

$$\begin{aligned} Vol(Y)_G =\frac{Vol(Y)}{Vol(P_Y)} \end{aligned}$$

Thus we can empirically measure productivity growth by the change of this inequality measure. It is also noted that a smaller firm size does not necessarily generate greater growth of the industry.
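The following Python sketch, our own illustration with invented generator data, computes Vol(Y) by the determinant formula and the Gini volume; we read \(P_Y\) as the axis-parallel parallelotope spanned by the components of \(d_Y\), which is an assumption of this sketch.

```python
# A sketch of the zonotope volume formula and the Gini volume above
# (cf. Dosi et al., 2016). Vol(Y) sums the absolute determinants of all
# (l+1)-subsets of the N generators. P_Y is read here as the axis-parallel
# parallelotope with diagonal d_Y (an assumption of this sketch).
from itertools import combinations
import numpy as np

def zonotope_volume(gens):
    d = gens.shape[1]                       # ambient dimension l + 1
    return sum(abs(np.linalg.det(gens[list(idx)]))
               for idx in combinations(range(len(gens)), d))

def gini_volume(gens):
    d_Y = gens.sum(axis=0)                  # industry activity = main diagonal
    return zonotope_volume(gens) / np.prod(d_Y)

Y = np.array([[2.0, 1.0, 5.0],              # (input 1, input 2, output) per firm
              [1.0, 2.0, 4.0],
              [3.0, 1.5, 6.0],
              [0.5, 0.5, 1.5]])
print(zonotope_volume(Y), gini_volume(Y))
```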

2.3 The Effect of the Financial Industry on the Real Economy

The present economic system needs to be keenly examined in terms of several insights from the new coordinates. The advent of the financial big bang was so shocking that we tend to lose sight of the effectiveness of the production economy. However, it is easy to verify that the effect of financial activities is extremely small in terms of value added or income creation if we measure it by a macroscopic GDP aggregate. As Table 7.1 shows, the contribution to GDP of the financial industry, even together with the insurance industry, is merely 3.2%. This ratio is also within 10% even in the United States. Readers will be surprised to learn how small the income-creation capability of the financial industry is, despite its extraordinary financial wealth creation. In summary, therefore, we are forced to recognize a dichotomy of the economy into the real economy and the financial economy. The linkage between the two is really thin.

Table 7.1 The 13-sector input-output table, Japan, 2015, at producer prices

2.4 The Landscape of Exchange Systems

Now we inspect how market exchange is constructed. The classical image of market exchange breaks down, reflecting both the rapid change of market organization and that of the communication system. As for market organization, as shown by Mirowski (2007), a market is repeatedly either spun off from or complementary to the underlying market, forming a highly complicated layered system in the evolution of markets. On the other hand, with the rapid change of the communication system both locally and globally, a series of new exchange mechanisms has appeared. These may be essential concepts for understanding the modern market system and the contemporary societal system. Incidentally, non-symbolic methods may reflect fast thinking, while symbolic methods may reflect slow thinking, as Kahneman (2011) pointed out.

Among financial technologies, HFT is a striking factor. The operation speed is on the order of microseconds, preventing much working from being performed by human intelligence. In fact, the average processing speed of Bitcoin is about 10 min, leading us to excessively depend on computing ability such as “proof of work”. ... [On the other hand, the] horizontal axis shows the type of processing, from less symbolic to more symbolic. Symbolic processing implies the so-called programming based one. Through the recent rapid progress of pattern recognitions due to machine and deep learning, nonsymbolic processing will begin a new stage of AI work to replace previous human works (Aruka, 2020, 380).


The Hyper-Speed Domain: Relaxed Static Stability

This observation recalls the notion of “relaxed static stability”. Based on the idea of “fail safe”, structural stability has until now been important in designing planes and ships. By contrast, a modern stealth fighter like the Lockheed F35 is designed on the idea of relaxed static stability. It is well known that bicycles respond much more quickly than tricycles and restore themselves from instability by countermotion. The countermotion can be controlled by computational power. Thus, we expect that some nonlinear effect operates behind financial flash crashes, always accompanied by a countervailing power.Footnote 8

The Slow-Speed Domain: Distributive System of P2P Exchange Network

In contrast with the hyper-speed domain, what matters when employing blockchain technology is “slow speed”. Here the word “slow” may be interpreted as “non-hyper-speed” or “non-microsecond”. On a certain scaling, blockchain may be speedy enough, except for the mining speed of Bitcoin. In this sense, it is easy to identify the cryptocurrency using blockchain technology as a slow-speed technology. More interestingly, there currently exists a worldwide node network at the basis of P2P transactions. It is interesting to see the global Bitcoin node distribution, which is always accessible from the homepage https://bitnodes.earn.com/nodes/live-map/.

The Difference Between the Traditional Transaction and the P2P Transaction

Even HFT transactions require an “auctioneer”. Although the “auctioneer” is a computer server, HFTs usually assume a centralized system of both double auction and batch auction under the restrictive administration of two rules: time priority and price priority. Thus, no trade holds without matching coordination or the continued announcement of matching results. However, the amount of information generated in HFT transactions increases rapidly due to the faster speed of settlement.

P2P Transactions also Generate a Rapid Growth of Information

P2P transactions operate on the basis of blockchain mining. As the number of transaction nodes increases, a geometrically rapid growth of transaction information is generated due to the network’s properties. A practical application of P2P transactions becomes feasible just when the grand infrastructure for mining is arranged at any rate.

2.5 Transductive/Symbolic Reasoning

It was noted in the earlier section that the symbolic/non-symbolic axis will be a key coordinate for identifying the parts of the socio-economic landscape. We will now examine this mode of reasoning. Brownlee (2017) compactly illustrates transductive learning as follows:

Transduce:

To transduce means to convert something into another form, i.e., “to convert (something, such as energy or a message) into another form; essentially sense organs transduce physical energy into a nervous signal”. Merriam-Webster Dictionary (online), 2017

Transducer in engineering:

It is a popular term from the field of electronics and signal processing, where a transducer is a general name for components or modules converting sounds to energy or vice versa. Digital Signal Processing Demystified, Broesch (1997).Footnote 9

Transduction in biology:

The action or process of transducing; especially: the transfer of genetic material from one microorganism to another by a viral agent (such as a bacteriophage). Merriam-Webster Dictionary (online), 2017 (Fig. 7.3).

Fig. 7.3 A symbolic processing: a nearest function trained on a mixed-type dataset of dog and cat photos. *Source Made from Mathematica

Transductive Learning

This is an interesting framing of supervised learning, where the classical problem of “approximating a mapping function from data and using it to make a prediction” is seen as more difficult than required. Instead, specific predictions are made directly from real samples of the domain. No function approximation is required.

The model of estimating the value of a function at a given point of interest describes a new concept of inference: moving from the particular to the particular. We call this type of inference transductive inference. Note that this concept of inference appears when one would like to obtain the best result from a restricted amount of information (Vapnik, 1998, 169)

Example of k-NN classification

The test sample (green dot) should be classified either as a blue square or as a red triangle. If \(k = 3\) (solid line circle) it is assigned to the red triangles because there are 2 triangles and only 1 square inside the inner circle. If \(k = 5\) (dashed line circle) it is assigned to the blue squares (3 squares vs. 2 triangles inside the outer circle). Cited from Wikipedia https://en.wikipedia.org/wiki/K-nearest_neighbors_algorithm. The right panel of Fig. 7.4 is an extended version of the left panel; whether the center point is blue or red is to be estimated.

Fig. 7.4 K-nearest neighbors algorithm and its generalization. *The right figure is a three-dimensional extension of the left figure. The left figure is cited from the Wikipedia article’s figure https://en.wikipedia.org/wiki/K-nearest_neighbors_algorithm#/media/File:KnnClassification.svg

2.6 Science and Ethics

Until now, prediction has inevitably been associated with the following problems, as Klaus Mainzer, professor emeritus at the Technical University of Munich and philosopher of science, has pointed out. In his book, Mainzer (2007) discusses the “prediction problem” in terms of the Joseph effect (persistence) and the Noah effect (discontinuity). Following Mainzer (2007), we discuss science and ethics.Footnote 10

We have a particular type of self-fulfilling prophecy like the Oedipus effect. According to this effect as interpreted by Robert K. Merton, a prophecy declared as truth when it is not may sufficiently influence people, either through fear or logical confusion, so that their reactions ultimately fulfill the false prophecy. This implies that the collective macrostate of various social orders (the order parameter) can be averaged over its parts. We apply this self-fulfilling rhetoric to the rational expectations hypothesis. This hypothesis might be self-fulfilling, either through fear or logical confusion, of course. But this prophecy must face some logical failure. It can be verified in the framework of complex dynamics that this hypothesis refers only to a unilateral direction of the whole dynamics, the direction in which a single rational individual has to contribute to the rational macrostate of the economy. Moreover, we need the other aspect of the full feedback: “Its order parameters strongly influence the individuals of the society by orientating (enslaving) their activities and by activating or deactivating their attitudes and capabilities” (Mainzer, 2007, p. 395). This is just the slaving principle elucidated as synergetics by Haken (1977) and Weidlich (2000). These dynamics are thus bounded by critical values, outside of which the system falls into an unstable situation. The fulfillment of the prophecy might thus be circumvented.

Consideration in this dimension will involve some ethics, in judging whether weight should be placed on the Joseph effect (persistence) or the Noah effect (discontinuity).

However, what we are now convinced of is that the advent of the quantum computer age will fundamentally change the system of information communication, including monetary exchange. At that point, the traditional analysis of induction-deduction will be replaced by an analysis based on transduction (machine-learning inference), which will make it possible to make predictions that traditional analysis has missed, and this will make traditional predictions themselves obsolete. Indeed, MIT is said to be already working intensively on new predictions (time machines?) based on this understanding. However, the basis of the Great Reset will still be a revolution in computing power and the associated revolution in communication.

To be sure, we are faced with these kinds of prediction difficulties. In the near future, however, a new science should overcome them in the era of galactic space development by humans, if extraterrestrial factors are taken into account. Thus we will explore the network structure of time series, i.e., something causal.

Although we can grasp only a slight indication, in the following we will cast new light on time series prediction in view of the visibility graph. This will suggest a new inferential procedure as a means to obtain something causal connecting the time series with a network structure.Footnote 11

3 The Time Series in View of Horizontal Visibility Graph

3.1 The Visibility Graph

First, we define the visibility graph.

By definition, in the visibility graph, each node indicates a sample that is connected to another node if visibility between the two exists.

The operation to connect the segments is then explained in the following manner.

  • According to the visibility rule, the values lying between the highest node and the second-highest node can be connected with each other. In other words, two vertices (nodes) are connected by an edge if the corresponding events in the time series are larger than all the events between them.

  • A time series is thus transformed into a visibility graph as follows. The bars represent the values of the time series. Suppose the initial node is the starting node. Choose the values consecutively until a value higher than the initial value is found. Then specify that higher value as the end node.

  • Otherwise, if no value higher than the initial value is found, the next value after the initial one is chosen as the end node. The end node then becomes the new initial node, and the same procedure is repeated. Each batch of the procedure is regarded as a cycle.

  • Following Kaurov (2013) in the Wolfram Demonstrations Project, we focus on the starting node of each batch and connect these nodes to form a graph-theoretic shortest path (Fig. 7.5).

A horizontal visibility graph of a time series is thus constructed as followsFootnote 12:
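The following is a minimal Python sketch of this construction, our own illustration rather than code from the cited demonstration; the sample series is invented. Two samples are linked iff both exceed every sample lying between them.

```python
# Horizontal visibility graph (HVG): nodes i and j are linked iff
# x[i] and x[j] are both larger than every x[k] with i < k < j
# (cf. Lacasa et al., 2008).
def hvg_edges(series):
    edges = []
    n = len(series)
    for i in range(n - 1):
        edges.append((i, i + 1))        # consecutive points always see each other
        top = series[i + 1]             # running maximum between i and j
        for j in range(i + 2, n):
            if series[i] > top and series[j] > top:
                edges.append((i, j))    # both ends exceed everything in between
            top = max(top, series[j])
            if top >= series[i]:
                break                   # nothing further right is visible from i
    return edges

x = [0.8, 0.3, 0.9, 0.4, 0.7, 0.2, 1.0]
print(hvg_edges(x))
```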

The Shortest Path

This study develops both the time evolution of a CA and the network formation in view of the HVG. Following Kaurov (2013), we call this development an evolution of a finite elementary cellular automaton (ECA). This can be done in a few different ways. We can consider every step of an ECA evolution to be a binary number and calculate its decimal form by counting digits from left to right or in reverse. In view of network formation, the shortest path consists of the starting nodes of the cycles. We may then measure the difference between each event and each node of the shortest path with realized matching. In his smart demonstration, Kaurov has shown a graph-theoretic shortest-path algorithm by connecting the starting nodes of each cycle in view of the HVG. This path is depicted in yellow on his HVG network.
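As a sketch of this shortest-path idea, not Kaurov's actual code, the following builds the HVG of a short series and extracts the shortest node sequence linking its two ends with networkx; it assumes the hvg_edges function from the sketch above.

```python
# Graph-theoretic shortest path on the HVG, in the spirit of Kaurov (2013).
# Assumes the hvg_edges function defined in the previous sketch.
import networkx as nx

x = [0.8, 0.3, 0.9, 0.4, 0.7, 0.2, 1.0]
G = nx.Graph(hvg_edges(x))                           # HVG as an undirected graph
path = nx.shortest_path(G, source=0, target=len(x) - 1)
print(path)                                          # [0, 2, 6]: hops along the peaks
```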

Fully Random, Rule-Based Iterated Cellular Automata (FRICA)

In our recent studies (Aruka et al. 2019, 2020), we focus on the horizontal evolution of our time series generated by the FRICA. Rule 110 proves itself able to reproduce such local structures. More interestingly, Wolfram’s research group has also discovered these structures in the behaviors of fully random, rule-based iterated cellular automata (FRICA) with multiple rules. FRICA is employed to assess how damaging the inclusion of a given rule is to the universal behavior of rule 110, for instance. We can then detect some critical conditions that may make a certain local structure collapse by changing the selection of rules. This new observation may bring us a new perspective, or at least a useful hint, on how to measure market performance.

Fig. 7.5 A simple illustration of the HVG and its network. *Source The author depicted these figures by referring to the figures appearing in Lacasa et al. (2008)

We apply the ideas of shortest path algorithms to the FRICA to get Fig. 7.6.

Fig. 7.6 The HVG and its graph-theoretic shortest-path algorithm in the left/right-sided case of the FRICA (rules 150, 25 and 74). *These figures are simulated by the use of Kaurov (2013)

3.2 Exponential Distributions Reflecting Events Recurring at Random in Time

Next, we look at an application of the horizontal visibility graph. Lacasa et al. (2008) derived the exponential distribution from the HVG. They dealt with cases where “the algorithm captures the random nature of the series, and the particular shape of the degree distribution of the visibility graph is related to the particular random process”.Footnote 13 It is thus seen that exponential distributions have a close link to events recurring “at random in time”.
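This exponential form can be checked numerically: for an i.i.d. random series, the HVG degree distribution is known to be \(P(k) = (1/3)(2/3)^{k-2}\). The following sketch, reusing the hvg_edges function defined in Sect. 3.1's sketch, verifies this by simulation.

```python
# Monte Carlo check: the HVG of an i.i.d. random series has the exponential
# degree distribution P(k) = (1/3) * (2/3)^(k - 2), k >= 2.
# Assumes the hvg_edges function from the sketch in Sect. 3.1.
import random
from collections import Counter

random.seed(1)
x = [random.random() for _ in range(10_000)]
deg = Counter()
for i, j in hvg_edges(x):
    deg[i] += 1
    deg[j] += 1
dist = Counter(deg.values())
for k in range(2, 7):
    print(k, dist[k] / len(x), (1 / 3) * (2 / 3) ** (k - 2))  # empirical vs. theory
```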

It is then interesting to know the logical links to other distributions. Mathematica briefly gives a good illustration of the history of the exponential distribution.

Historically, the exponential distribution has been used most widely to describe events recurring “at random in time”, i.e., in circumstances in which the future lifetime of an individual has the same distribution regardless of its present state. The use of the exponential distribution has increased significantly over the last 75 years, due in part to considerable research within the field of order statistics beginning in the early to mid-1950s. Since then, the exponential distribution has been used to model various phenomena over intervals of approximately constant rate, e.g., the number of phone calls placed in a specific time interval each day. In stochastic processes, the exponential distribution describes the lengths of interarrival times in homogeneous Poisson processes. The exponential distribution is also used in credit risk modeling, queueing theory, reliability theory, physics, and hydrology.Footnote 14

Mathematica also gives the logical links between the exponential distribution and other distributions. These links are helpful for understanding the internal relationships between them. Usually in econophysics, we like to see the particular link between the power-law distribution and the exponential distribution. But it is also important to observe the other distributions through the exponential distribution.

We refer to the relationships between the exponential distribution and the other well-known distributions in our field.Footnote 15

Power Distribution:

The power distribution is a transformation of an exponential distribution; conversely, an exponential distribution can be obtained from a power distribution.

Pareto Distribution:

The Pareto distribution is a transformation of an exponential distribution; conversely, a transformation of a Pareto distribution yields an exponential distribution.

Logistic Distribution:

The logistic distribution is a transformation of exponential distributions; for instance, the log-ratio of two independent exponential random variables follows a logistic distribution.

Poisson Distribution:

A parametric mixture of a Poisson distribution over an exponentially distributed rate follows a geometric distribution.
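As a quick numerical check of these links, the following Python sketch, our own illustration with arbitrary parameter choices, verifies the logistic, Pareto, power, and Poisson-geometric transformations by simulation.

```python
# Monte Carlo checks of the transformation links listed above. Facts used:
# for X, Y i.i.d. Exp(1), log(X / Y) ~ Logistic(0, 1);
# for X ~ Exp(rate alpha), exp(X) ~ Pareto(x_m = 1, alpha) and
# exp(-X) follows a power-function law on (0, 1);
# a Poisson whose rate is Exp(1)-distributed is Geometric(1/2).
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
x, y = rng.exponential(1.0, n), rng.exponential(1.0, n)

logi = np.log(x / y)                          # logistic via log-ratio
print(np.var(logi), np.pi ** 2 / 3)           # Logistic(0,1) variance = pi^2/3

alpha = 2.0
par = np.exp(rng.exponential(1 / alpha, n))   # Pareto(x_m = 1, alpha)
print(np.mean(par), alpha / (alpha - 1))      # Pareto mean for alpha > 1

pw = np.exp(-rng.exponential(2.0, n))         # power law: P(pw <= t) = t^(1/2)
print(np.mean(pw <= 0.25), 0.25 ** 0.5)

pois = rng.poisson(rng.exponential(1.0, n))   # Poisson mixed over an Exp(1) rate
print(np.mean(pois == 0), 0.5)                # geometric: P(N = 0) = 1/2
```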

3.3 Contriving a Simple Market to Be Manipulated

Now we narrow down to manipulation in market exchange. When we employ a virtual market simulator like U-Mart, it is easy to test whether the market can be manipulated. We commonly implement so-called technical analytical agents in the U-Mart. In recent years, we have discovered a special agent set, i.e., the Nakajima-Mori agent configuration, that can always realize any given real price movement.

The Minimal Nakajima-Mori Agent Set in the U-Mart Futures Price Formation

In the U-Mart experiment, the default standard set of strategies is fixed as in Table 7.2. As the number of agents of each strategy is simultaneously increased, the possibility of matching current orders and settling them becomes much greater. It is noted that the representative strategies of traditional technical analytical agents are employed.Footnote 16

Table 7.2 The default composition

On the other hand, the minimal Nakajima-Mori agent set was derived from many simulations of the U-Mart acceleration experiment in the following way.Footnote 17

Table 7.3 The minimal composition

3.4 Characterizing the Even-Matching in the Market Transaction

In the narrative of the double auction, as trading continues long enough, the market may sometimes fall into a state of steadiness as a result of evenly matched forces from both the rising and the declining direction. This state of the market is called “even matching”.

As the rate of increase declines after the consensus breaks down, we will then catch up again with a sell-off opportunity. The figure illustrates this matching process in the case of a rising trend. The right edge of the triangle is the point where the market can move either up or down. Thus, a triangle is formed just before a breakout of the price series. The red circle in the figure indicates a breakout point to buy, at which the price will rapidly increase. The breakout point will also break down if the price no longer rises. This, then, means the end of an even-matching process (Table 7.3).

Triangle Formation in the Market Transaction

Interestingly, in view of the visibility graph, the initial point of the triangle formation may be regarded as the start of a segment to be connected into the shortest path. The breakdown point then marks the end of the same segment of the shortest path. The triangle in the figure corresponds exactly to a segment of the shortest path of the price time series.Footnote 18

A Breaking Point on the Shortest Path

In other words, the shortest path of our price time series can be interpreted as a whole path connecting each new segment to the next, each time the consensus breaks down and the price wanders about.

In a state where the triangle keeps its shape longer, orders placed earlier may accumulate. These previous orders will pull the market price either well below or well above a newly assigned order. As a result, the market price will be pegged to a bottom or a ceiling. This event often occurs in the narrative of the double auction (Fig. 7.7).

Fig. 7.7 Triangle formation in the case of an ascending price trend: the price fluctuates within a narrowing triangle and breaks upward at its right edge
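As an illustration only, not the authors' algorithm, the following sketch flags a breakout from a contracting triangle: rolling highs fall, rolling lows rise, and the price then exits above the recent range. The window size and price series are invented.

```python
# Toy detection of the triangle-and-breakout pattern described above.
def find_breakout(prices, window=5):
    for t in range(2 * window, len(prices)):
        recent = prices[t - window:t]
        earlier = prices[t - 2 * window:t - window]
        # Contracting range: highs are falling and lows are rising.
        contracting = max(recent) < max(earlier) and min(recent) > min(earlier)
        if contracting and prices[t] > max(recent):
            return t            # upward breakout from the triangle
    return None

# Damped oscillation around 100 (a contracting triangle), then a jump up.
prices = [100 + 3 * (0.85 ** k) * (-1) ** k for k in range(20)] + [106.0]
print(find_breakout(prices))    # 20: the bar that breaks above the triangle
```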

The mentioned event is reproduced as a jump in the futures price if we employ the price time series generated from the daily Nikkei225 spot prices.Footnote 19

Fig. 7.8 The Nikkei225 spot prices case: futures and spot prices with futures volume

The Shortest Path of Nikkei225

Finally, we depict the shortest path of this case and its network structure. The latter shows the set of strategies playing the key role in settling the futures market exchange on the shortest path (Figs. 7.9, 7.10).

Fig. 7.9 The shortest path derived from the Nikkei225 spot time series

Exercises

7.1

Describe the current changes in the style of production compared to the traditional capitalist economy, which was based on machine-made large industry, and mention a set of decisive changes, including the change of social/private lifestyles.

7.2

Try to sort out, in your own way, the attributes brought by the Great Change to the capital-labor system under AI penetration. Taking transhumanism into account, discuss this subject.

7.3

Discuss how the stock/currency exchange system was altered by the shift to HFT (high-frequency trading). In particular, discuss whether this change made the original system much more efficient or not.

7.4

List the uses of blockchain, other than as a financial tool, that you currently regard as promising.

7.5

According to Hildenbrand (1981), the short-run production function was verified to derive from a zonotope-type production set. Describe in what ways the zonotope-type production function differs from the traditional production function.

7.6

Traditionally, we use an approximation function to infer the values of a function at points of interest. Induction is the operation of drawing an approximation function from examples, while deduction is the operation of drawing the values of the function at points of interest from the approximation function. In this context, discuss what kind of inference machine learning achieves (Fig. 7.9).

7.7

The “agglomerate method”, a hierarchical clustering method, is often used. “Agglomerate” works when clusters have similar densities and are isotropic like “colors”. Sometimes symbolic processing is possible. In some cases, we need some training to tie one picture to “cat is grey” and another picture to “cat is fast”, and then detect the nearest feature. These detections perform a desired classification/identification between pictures. Learn the “agglomerate method”, and then apply it to your target. Hint: use the operator Agglomerate (a machine learning method) if you like Wolfram Mathematica.

7.8

In the text above, “[t]he U-Mart system is an artificial intelligent futures transaction system of long-running lifetime initiated by Japanese computer scientists since 1998” (see Aruka (2015, 111–112) and Shiozawa et al. (2008)). In the U-Mart system, the so-called traditional technical agents, either human or algorithmic, can participate in the play. Try to summarize briefly each technical agent of the U-Mart, and discuss the usefulness of the minimal Nakajima-Mori agent set (Fig. 7.10).

7.9

Illustrate the idea of the shortest path of the horizontal visibility graph in your own way, and then draw a numerical example of it as you like.

7.10

The Wolfram Demonstrations Project has a useful tool, “Horizontal Visibility Graphs for Elementary Cellular Automata” (Kaurov 2013). Access https://demonstrations.wolfram.com/HorizontalVisibilityGraphsForElementaryCellularAutomata/ to try several simulations of the cases you are interested in. Then discuss, in your own way, some advantages of the visibility graph, i.e., the transformation between a network development and a time series representation.

Fig. 7.10 The network structure of the shortest path derived from the Nikkei225 spot time series. The ten nodes are labeled SF spread, moving average, RSI, S random, random, S RSI, day trade 1, trend, anti-trend, and S moving average strategies