Blockchain Overview and Quantum Computing Capable Threats

PUBLISHED ON MAY 3, 2018

Nicolas Acton and Robert (Randy) Soper

Project report for CS-5560 Fundamentals of Information Security, 3 May 2018

Abstract

The blockchain technology ecosystem, including the initial use-case application of Bitcoin, is built on two primary ideas, implemented through a range of technological approaches, that provide widespread public confidence in the security of platforms built on the technology [1], [14]. The first is a public, distributed hash-chain to prevent changes to posted "blocks".  The second depends on schemes for consensus creation across a decentralized community of third-party participants.  Together these processes provide integrity and resilience. The Blockchain itself does not provide confidentiality or non-repudiation for individual block transactions [13]; most applications, like cryptocurrency, adopt standard signing and encryption approaches [11]. Blockchain uses novel approaches to create an indelible public ledger without requiring trust in a centralized authority, but many of the security sub-components of Blockchain implementations rely on standard information security technology and protocols, like RSA or ECDSA public key signing, or SHA hash algorithms.  Setting aside vulnerabilities associated with collusion within the distributed third-party community, or the adoption of weak consensus-creation schemes, the underlying cryptographic practices of the Blockchain will likely remain secure against attack by classical computer architectures so long as they are updated ahead of advances in computational capability [14]. Some have suggested that potential applications of the emerging quantum-computing architecture call this understanding into question and could pose a credible threat to the security practices of the distributed ledger [13].  We suggest that the fundamental Blockchain security functionality is not substantially vulnerable to quantum computer attack, though weakening of hard limits on feasible collusion may erode public trust in the Blockchain's capabilities.  The most immediately exploitable vulnerabilities result from block/transaction-level privacy and non-repudiation algorithm and protocol selections.

1. Blockchain Security

A Blockchain is simply a distributed ledger built on one or more cryptographic technologies, depending on security requirements.

1.1. Hash-Chaining

The primary foundation upon which the Blockchain is built is the block header. The header is composed of (at least) a unique identifier for the block, a hash of the current block, and the hash of the previous block. Frequently a nonce and/or timestamp is also included; some Blockchain schemes use the calculation of the nonce to meet an allowed hash form as Proof of Work (PoW) in consensus creation [13]. Because each header's hash depends on the hash of the previous block, the headers form a chain, any segment of which can be verified, up to and including the entire sequence back to the first block. Typically the values are created using standard validated hash algorithms, like the Secure Hash Algorithm (SHA), to ensure high complexity in creating a preimage of the block during processing and network distribution, preventing an attacker from spoofing a spurious block into the chain [15]. Since changes imposed on any blocks or hashes will result in a verifiable inconsistency, this hash-chaining function is believed to provide integrity for Blockchain blocks. Validation of hash-to-block and hash-to-chain computations is extremely efficient, due to the selection of hash algorithms that meet the standard criteria that calculating the hash given the input is computationally easy, while reversing the hash computation is hard.
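To make the mechanism concrete, the following minimal Python sketch shows hash-chained headers and how any tampering is detectable on re-verification. The field names and serialization here are illustrative simplifications, not the Bitcoin wire format:

```python
import hashlib
import json
import time

def block_hash(header: dict) -> str:
    """SHA-256 over a canonical serialization of the header fields."""
    body = {k: v for k, v in header.items() if k != "hash"}
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def make_block(index: int, data: str, prev_hash: str) -> dict:
    header = {"index": index, "timestamp": time.time(),
              "data": data, "prev_hash": prev_hash}
    header["hash"] = block_hash(header)   # hash covers every other field
    return header

def verify_chain(chain: list) -> bool:
    """Recompute every hash and check each block points at its predecessor."""
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block):
            return False                  # block contents were altered
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False                  # linkage to the previous block broken
    return True

chain = [make_block(0, "genesis", "0" * 64)]
chain.append(make_block(1, "second block", chain[0]["hash"]))
assert verify_chain(chain)

chain[0]["data"] = "tampered"             # any change is detectable downstream
assert not verify_chain(chain)
```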

One important aspect of Blockchain security results from its distributed nature.  To insert a spoofed, incorrect block into the chain, for example, an attacker would not only have to defeat the preimage resistance of the hash, but would also have to carry out the insertion across a broad distribution of the decentralized nodes [14].

1.2. Block Creation/Validation

In the Bitcoin implementation of Blockchain, the header nonce is a 32-bit number that must be determined to meet specific hash criteria. In Nakamoto's original paper on Bitcoin [1], he proposed that PoW be established by requiring that the hash of each block start with n leading zeros.  Regardless of the other elements included in the hash input, this can always be accomplished by selecting a nonce of the proper length and value.  However, it is not possible to calculate the correct value of the nonce directly, because the hash function is not reversible.  In Bitcoin, the process of calculating the correct nonce is called "mining", and a group of distributed miners compete to calculate the correct nonce and claim a reward [14]. Effectively, the goal of a miner is to brute-force possible nonces until one miner finds a hash that meets the target hash criteria, in order to "claim" the block and provide their PoW for consensus consideration. Since this is a search over a space of 2^32 possible values, it is a computationally difficult task but achievable with traditional computing architectures, with effort focused primarily on optimizing parallelized processing, usually through the use of GPUs.
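The search itself is conceptually simple, as the following Python sketch shows. The header bytes and difficulty target here are simplified stand-ins for Bitcoin's actual header serialization and target encoding:

```python
import hashlib

def mine(header_bytes: bytes, difficulty_bits: int) -> int:
    """Brute-force a 32-bit nonce until the block hash falls below the target.

    Requiring n leading zero bits is the simplified form of PoW described by
    Nakamoto; real Bitcoin compares the hash against a numeric target.
    """
    target = 1 << (256 - difficulty_bits)
    for nonce in range(2 ** 32):
        digest = hashlib.sha256(
            hashlib.sha256(header_bytes + nonce.to_bytes(4, "little")).digest()
        ).digest()                        # Bitcoin uses double SHA-256
        if int.from_bytes(digest, "big") < target:
            return nonce
    raise RuntimeError("nonce space exhausted; real miners vary other fields")

nonce = mine(b"example block header", difficulty_bits=16)  # easy demo target
print(f"found nonce {nonce}")
```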

Note that the miner community is motivated to participate in building and maintaining the Blockchain by the competitive rewards they receive.  They are disinterested parties relative to the content of the ledger.  One common concern in Blockchain security is what level of collusion would be required among the mining community to put illegitimate transactions into the ledger.  Assuming uniform computational mining capability (a sort of computational arbitrage supports this assumption), more than 50% of nodes within the network would need to collude to be reasonably certain of success in processing illegitimate transactions into the ledger, the so-called 51% attack [14]. Returning to the question of how difficult it would be to introduce a block based on a preimaged hash into the existing ledger: this is no longer a matter of controlling Blockchain growth.  Assuming that the ledgered block has already been fully disseminated throughout the network, replacing an existing block with another would require a very high level of collusion, or an otherwise broad attack on and control of nodes, and would also depend on broader policies about how such discrepancies are managed when discovered (such policies are not part of the native Blockchain protocols, as the likelihood of such an attack capability is currently vanishingly small).  Certainly, such blocks could temporarily be inserted at controlled nodes in isolation, and they would be difficult to discover because they would not be flagged by the hash validation protocol.

1.3. Encryption of Transaction Content

Employing a Blockchain is appropriate when establishing consensus within a distributed/peer-to-peer system is worth the cost of creating trust through PoW [15].  Some applications of the Blockchain ledger might be completely public and transparent. There have been recent proposals, for example, to apply the Blockchain to validation of content in public forums, and in other applications that place importance on transparency of block contents.

Often at least part of the information or transaction recorded in the block requires encryption for privacy, or a signature to validate the identity of the individual.  This is true for cryptocurrency, for example, to protect the integrity of each digital wallet.  In the Bitcoin protocols this is done using ECDSA, due to its more efficient key length relative to other available public key signature schemes [11].

2. Traditional Computing

Up to this point, growth of computational capability has been measurable and predictable, generally following the 1965 observation by Gordon Moore of Intel [2] that the density of transistors on processing components would double approximately every two years. With transistors acting as the main form of logic gate, accepting and outputting binary information in the form of bits, this translates to a doubling, or at least a predictable growth, of processing capability on the same schedule.

While many advancements have been made in the use of parallelized processing and enhanced digital throughput to increase the speed at which computation can be performed, the traditional computing architecture still boils down to the use of logic gates to perform transformations on a serialized and discrete stream of bits. As such, the physical limitations of this style of computation, as well as the observed rate of technological innovation, were thought to ensure that nearly all modern cryptographic algorithms would be virtually impossible to break so long as their underlying computational bases remained hard [16].  While it has never been shown that the hard problems that support many information security algorithms, like factoring the product of large primes, are NP-complete for traditional computers, the continuing attention of the worldwide research community gives some assurance that these problems remain intractable at present.

3. Quantum Computing

Classical computational architectures are generally constructed from transistors, and their fundamental units of action can be abstracted as bits. Effectively, a single transistor can measure a binary logical unit, or bit, of 0 or 1, represented by the set {0, 1}.

Quantum computer architectures are based on quantum-mechanical phenomena.  The quantum computing equivalent to the classical bit is a unit of computational capability called a qubit. There are a few characteristics of a qubit that make it compelling from a computational perspective when compared to a traditional bit.


Figure: Representations of classical bits and qubits [3]

3.1. Superposition

When observed, qubits follow quantum observational rules and collapse into a state that can be represented by a traditional binary value, written in quantum computing as the ket vectors |0> and |1>. However, prior to observation, the qubit exists in a hidden state that is a linear superposition of its two possible observable states, |ψ> = α|0> + β|1>; it might be conceived as existing in a state that partakes of both 1 and 0 simultaneously [3], [4].

What does this mean? Direct observation of all qubits in a quantum computing algorithm would cause them to collapse to binary values, eliminating any potential quantum advantage relative to a traditional computer.  Instead, designers of quantum programs/algorithms devise probabilistic means of measuring aspects of the interaction between qubit hidden states. Effective algorithms could mean that, while traditional bits represent only a binary state, a qubit's continuously valued amplitudes can carry information or characteristics that would otherwise require a vast collection of traditional bits.

3.2. Quantum Logic

Like traditional computing, quantum computing is built on fundamental logic gates. In order to retain qubit hidden state, quantum logic gates must be unitary (reversible).  This restriction, combined with the counterintuitive nature and more complex multidimensional space within which qubit computations operate, makes intuitive reasoning and the development of rules of thumb for effective quantum algorithms difficult.  Still, a defined set of useful quantum gates and desirable states for interacting qubits have been identified, and these make up the fundamental toolbox of the quantum computer programmer [4].
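The unitarity restriction is easy to verify numerically. The following minimal NumPy sketch represents the Hadamard gate, one of the standard single-qubit gates, confirms that it is unitary (and hence reversible), and shows it creating an equal superposition from |0>:

```python
import numpy as np

# Computational basis states |0> and |1> as column vectors.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# Hadamard gate: a workhorse single-qubit gate that creates superposition.
H = np.array([[1,  1],
              [1, -1]], dtype=complex) / np.sqrt(2)

# Unitarity (H† H = I) is exactly what makes the gate reversible.
assert np.allclose(H.conj().T @ H, np.eye(2))

psi = H @ ket0                    # |ψ> = (|0> + |1>) / √2
probs = np.abs(psi) ** 2          # Born rule: measurement probabilities
print(probs)                      # [0.5, 0.5] -- equal superposition

# Applying H again undoes the operation: reversibility in action.
assert np.allclose(H @ psi, ket0)
```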

4. Quantum Algorithms

4.1. Algorithm Objectives

The objective of quantum algorithms is to leverage the enhancing capability of the hidden qubit state to deliver faster computation or greater capability than is possible on a traditional computer.  In some cases it has been shown that certain believed-NP problems for traditional computers are P problems for quantum computers.  Note that, according to Nielsen and Chuang [4], these proofs have so far been restricted to believed-NP problems and never demonstrated for problems which are NP-complete; they suggest that this might mean either that the believed-NP problems are actually computable on traditional computers but the appropriate algorithm has not yet been discovered, or that all NP problems are actually P problems on quantum computers.  Their dichotomy does not seem exhaustive to us, although the former suggestion should cause some concern for security professionals, given the problems which are demonstrably solvable on a quantum computer.  Nielsen and Chuang [4] note that known algorithms providing quantum-computing advantage over traditional computers fall broadly into three categories: (1) the quantum version of the Fourier transform, (2) quantum search algorithms, and (3) simulation of quantum mechanical systems (e.g., molecular chemistry).

4.2. Shor’s Algorithm

Presented by Peter Shor of AT&T Bell Labs in 1994 [5], Shor's algorithm is considered the first proposed quantum algorithm with demonstrated real-world utility.  Shor's algorithm uses the quantum version of the Fourier transform.

Most significantly, Shor's algorithm can be applied to make classes of factoring-related problems that are intractable for a traditional computer feasible for a quantum computer.  Several applications pose direct threats to the traditional bases of asymmetric cryptography:

  1. Factoring of products of large primes
  2. Calculation of discrete logarithms
  3. Calculation of discrete logarithms over elliptic curve point groups

The decomposition of the problem proceeds like so: if we consider n to be the product of primes p and q, and pick an x not divisible by p or q, the sequence x mod n, x^2 mod n, x^3 mod n, … will repeat with some period, and this period will evenly divide (p-1)(q-1). For example, for n = 21 we know p = 3 and q = 7, therefore (p-1)(q-1) = 12. Taking x = 2, the period is 6, which evenly divides 12. This means that, assuming you can find the period of the sequence, you can reduce your search space for p and q exponentially by recognizing that (p-1)(q-1) is evenly divisible by that period.
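Everything in this decomposition except finding the period is classically cheap, as the following Python sketch illustrates for the n = 21 example. The function names are ours, and the brute-force period loop here is precisely the exponentially expensive classical step that Shor's algorithm replaces:

```python
from math import gcd

def multiplicative_order(x: int, n: int) -> int:
    """Classically find the period r of x, x^2, x^3, ... (mod n).
    This brute-force loop is the step Shor's algorithm replaces with the QFT."""
    value, r = x % n, 1
    while value != 1:
        value = (value * x) % n
        r += 1
    return r

def factor_from_period(n: int, x: int) -> tuple:
    """Given the period r, recover factors of n via gcd(x^(r/2) ± 1, n)."""
    r = multiplicative_order(x, n)
    if r % 2 != 0:
        raise ValueError("odd period; try another x")
    y = pow(x, r // 2, n)
    return r, gcd(y - 1, n), gcd(y + 1, n)

r, p, q = factor_from_period(21, 2)   # the n = 21 example from the text
print(r, p, q)                        # period 6; factors 7 and 3
```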

Of course, for a massive n the period can still be huge, as large as n-1 in some cases. This is where the quantum computation of Shor's algorithm provides advantage.  It uses the quantum Fourier transform (QFT), which acts as a linear transformation on vectors of complex amplitudes. While the details of the architectural implementation are complex and vary depending on the physical medium used to sustain and observe the phenomena, the idea, loosely, is that a string of qubits, each representing a potential period of the number n, can be fed through quantum logic gates, and only the "true" period will pass all of the logic gate inspections.

A mature implementation of Shor's algorithm could reduce the effective strength of algorithms relying on these "unfactorable" bases by an exponential factor. These include the most common approaches to public key cryptography, key sharing, and signing: ElGamal, Diffie-Hellman, DSA, RSA, and ECDSA.

Until recently, Shor’s algorithm was strictly theoretical; quantum computers were not large enough to factor even small integers. However, a research collaboration between MIT and University of Innsbruck developed a more efficient cascading approach to Shor’s, reducing the required number of qubits, and successfully built a 5 qubit implementation of the algorithm [7] to correctly factor the number 15 to its prime roots of 5 and 3 with 99 percent certainty.  The team has indicated that this utilization is scalable and a valuable proof of concept which will increase in capability in parallel with the number of qubits in quantum computers.

4.3. Grover’s Algorithm

Proposed by Lov Grover of Bell Labs in 1996, Grover's algorithm uses the hidden quantum state of qubits to accelerate searching across N elements from O(N) to O(√N) time [4].
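Grover's iteration is straightforward to simulate classically for small search spaces. The sketch below is a plain NumPy state-vector simulation (not a real quantum execution, and the function name is ours): the oracle flips the phase of the marked element, the diffusion step inverts all amplitudes about their mean, and after roughly (π/4)√N iterations the marked element dominates the measurement probabilities:

```python
import numpy as np

def grover_search(n_qubits: int, marked: int) -> np.ndarray:
    """State-vector simulation of Grover's algorithm over N = 2^n items."""
    N = 2 ** n_qubits
    state = np.full(N, 1 / np.sqrt(N))           # uniform superposition
    iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))  # ~O(sqrt(N)) steps
    for _ in range(iterations):
        state[marked] *= -1                      # oracle: flip marked phase
        state = 2 * state.mean() - state         # diffusion: invert about mean
    return np.abs(state) ** 2                    # measurement probabilities

probs = grover_search(n_qubits=4, marked=11)     # search among 16 items
print(probs.argmax(), probs.max())               # 11, ~0.96 after 3 iterations
```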

In 1997, Brassard, Høyer, and Tapp proposed a variant of Grover's algorithm (BHT, for the authors' initials) focused on calculating cryptographic hash collisions [8]. Their approach uses the Grover quantum search principle to improve the birthday-paradox odds from O(2^(n/2)) to O(2^(n/3)).  While the improvement of BHT is not as dramatic as the exponential improvement created by Shor's algorithm, it is important to note that many security protocols select the key length to provide assurance based on current computational capability and the traditional birthday-paradox prediction of the required number of bits.  Improving the odds of a collision would immediately put hashes currently assumed safe at risk [16].

5. Quantum Secure Cryptography

With quantum computing architectures emerging, the use of quantum algorithms to attack existing cryptography is a threat many are taking seriously.

“It is now clear that current Internet security measures and the cryptography behind them will not withstand the new computational capabilities that quantum computers will bring…” - Vanee’ Vines, NSA spokesperson (2015) [9]

5.1. Lattice-Based Schemes

Wolchover [9] describes lattice-based asymmetric cryptographic schemes as being based on "how easy it is to get lost in a 500-dimensional lattice."  The private key can be conceived as a point in high-dimensional space and the public key as point(s) nearby.  The protection of the scheme is based on the difficulty of establishing anything about the private key with certainty given the public key, due to the complexity introduced by the high-dimensional space.  According to Wolchover [9], lattice-based schemes are believed to be the most promising basis for future quantum-secure public key cryptography.  Many possible protocols are currently believed to be both traditional- and quantum-computer secure.  Most are extremely inefficient, both computationally and in network transmission, making them poor candidates for practical applications; exceptions are based on cyclic ring implementations, like NTRU [18].  The first lattice-based algorithm that is provably secure against quantum computer attack is called Learning with Errors and was developed by Oded Regev in 2005 [10].

5.2. Learning with Errors (LWE)

Wolchover [9] presents a toy example of LWE, which we analyzed and implemented in Python to develop a better understanding of the functionality of lattice-based public key protocols, their limitations with respect to transmission overhead, and their degree of security.  Wolchover's toy example is presented as follows:

“Pick any odd number … [t]hat’s your private key.  Now multiply it by any other number, and add a small even number to it. … Do this many times, producing a list of enlarged, perturbed versions of your private key. This list of numbers is your public key. … Say someone wants to send you a [single bit] message … this person randomly selects half of the numbers listed in your public key and adds them together. Then, to send the message '0,' your correspondent simply sends the sum back to you. To send the message '1,' the person adds one to the sum, then sends this back … [T]o decode the message, you simply divide the sum you’ve received by your private key. If the remainder is even, the message is '0.' If the remainder is odd, the message is '1.'”

Mathematically, we can show that this encryption/decryption scheme works. Each public key element has the form PUi = ai·PR + 2ei, for some multiplier ai and small even perturbation 2ei, so the ciphertext is

C = ( Σi PUi ) + M = PR·( Σi ai ) + ( Σi 2ei ) + M, and therefore C mod PR = ( Σi 2ei ) + M,

which is even exactly when M = 0, provided the accumulated even noise remains smaller than PR. Here PR is the private key, PUi is an element of the public key, M is the one-bit plaintext, C is the ciphertext, n is the number of elements in the public key, and the summations can be assumed to run over an arbitrary subset of up to n/2 elements without loss of generality.

In order to better explore the known transmission overhead challenges with lattice-based encryption and LWE, we implemented the toy version in Python with PR of O(10^5) and expected PUi of O(10^6-10^7).  We found that when implemented as described [9], ciphertexts were of O(10^9), hence requiring a 2^30 space (30 bits) to encode one bit of plaintext.  We can improve on the naive approach described by recognizing that any encryption will result in a large positive number.  The bit requirement for binary encoding can be reduced by removing the bias associated with the average action of the public key in the encryption process.  Because this key list is public, no additional information is revealed if the average expected value of the sum of one half of the public key is subtracted from the ciphertext before transmitting.  This constant can be added back by the recipient before decrypting.  The result will be a smaller (though still large) number centered on zero.  The binary encoding will have to include an additional bit to allow for two's complement representation of both positive and negative values.  For our case we found that we were able to reduce the encryption space requirement to 2^27 (27 bits) per one bit of plaintext.  This is still a significant overhead and illustrates the challenge of practical implementation of lattice-based cryptography [9].
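The following Python sketch reconstructs the essentials of our toy implementation, including the bias-removal optimization. The function names and exact parameter draws are illustrative, chosen to match the magnitudes described above:

```python
import random

def keygen(n=500, priv_magnitude=10**5):
    pr = random.randrange(priv_magnitude, 2 * priv_magnitude) | 1  # odd private key
    # Each public element is a multiple of PR perturbed by a small even error.
    pub = [pr * random.randrange(10, 100) + 2 * random.randrange(1, 10)
           for _ in range(n)]
    return pr, pub

def encrypt(bit: int, pub: list) -> int:
    subset = random.sample(pub, len(pub) // 2)   # random half of the public key
    return sum(subset) + bit

def decrypt(c: int, pr: int) -> int:
    return (c % pr) % 2          # parity of the remainder reveals the bit

pr, pub = keygen()
for bit in (0, 1):
    assert decrypt(encrypt(bit, pub), pr) == bit

# Bias removal: subtract the publicly computable expected half-sum so the
# transmitted value is a smaller number centered on zero (one extra sign bit).
bias = sum(pub) // 2
c_small = encrypt(1, pub) - bias
assert decrypt(c_small + bias, pr) == 1
```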

There has been significant effort to find lattice-based schemes that reduce the encryption complexity enough to support practical implementation.  Unfortunately, authors from the British cryptographic service GCHQ recently published work that challenges the security of many reduced-complexity lattice-based schemes against quantum attack [19].

Our toy example also illustrates the basis on which quantum security is established, although the toy is not itself secure.  For our example of 500 elements in the public key, the number of possible sums that could result, and presumably would need to be explored if each were independent, is equal to the number of combinations of 500 elements taken 250 at a time; for the toy example in general:

C(n, n/2) = n! / ((n/2)! (n/2)!)

For n = 500, this equates to O(10^149), or equivalently O(2^495).  In general this would provide significant protection against brute-force attack.  However, for the toy example, the space spanned by the scalar summation is much smaller than the number of possible combinations.  We found a high incidence of common sums within the set of combinations, with many different subsets producing the same ciphertext (demonstrated in the sketch following this paragraph).  The nature of the toy LWE protocol is such that no two achievable summations can be adjacent, so any summation of one half of the public key that results in either C or C-1 is sufficient to break the encryption by brute force.  Effective LWE of the type originally recommended by Regev is based on a large vector, rather than scalar, space, such that the probability of a collision of public key encryption combinations is vanishingly small [10].  Unfortunately, this further contributes to protocol and transmission complexity and overhead.
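The collapse of the scalar sum space can be demonstrated directly at small scale. The sketch below uses deliberately tiny, illustrative parameters so that every half-subset can be enumerated, and counts how many distinct ciphertext sums the combinations actually produce:

```python
import random
from itertools import combinations

# Deliberately tiny parameters so every half-subset can be enumerated.
pr = 101
pub = [pr * random.randrange(2, 20) + 2 * random.randrange(1, 5)
       for _ in range(20)]

# All C(20, 10) = 184,756 half-subsets, reduced to their scalar sums.
sums = [sum(c) for c in combinations(pub, 10)]
print(f"{len(sums):,} subsets -> {len(set(sums)):,} distinct ciphertext sums")
# Far fewer distinct sums than subsets: the scalar space collapses, so a
# brute-force attacker need only cover the achievable sums, not all subsets.
```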

6. Potential Threats to Blockchain

A June 2017 MITRE paper, "Blockchain and Quantum Computing" [13], makes several detailed claims about possible threats that quantum computing poses to Blockchain technologies. The claims do highlight some areas where quantum algorithms seem to threaten Blockchain or supporting protocols, but our group was skeptical about the resulting risk assessments made in the paper.

Their first claim is that the biggest threat to Blockchain technology from quantum computing is the use of the Grover algorithm to find a hash collision. They claim, incorrectly, that a mature implementation of the algorithm on a quantum computer could reduce the key search-space from 2^k to 2^(k/2) and allow for a preimage computation within a usable time range for an adversary.

First, attacks against established hashes to find a useful preimage would not be accomplished using Grover's algorithm, but rather the accelerated BHT collision detection algorithm.  This algorithm reduces the vulnerable key length of a hash by ⅔, but only for a quantum computer with processing power comparable to a traditional computer.  The size of a quantum computer required to execute attacks against common Shor-vulnerable public key cryptographic algorithms is smaller, in terms of qubits, than is required to match a significant fraction of current traditional computing capability (Waterland [11] suggests that only 11 qubits are required for a Shor attack against the ECDSA signing algorithms currently used for Bitcoin transactions).  Second, the nature of the Blockchain is such that it is biased toward validation and distribution of authentic blocks by benign miners, whose objective is to proceed to the next competitive mining opportunity as quickly as possible.  As of May 1, 2018, the average time-to-consensus for a block in Bitcoin was 8.72 minutes [17]. Even with an incredibly powerful and mature quantum computer, preimage collision and seizure of control of the distribution network by a single malicious node is not possible within that time period.  It would be possible to claim that a spurious preimage block is part of the chain once calculated; the hash-chain validation process of the Blockchain would not be able to dispute the claim.  However, integrity of the Blockchain is also based on distributed information.  Once the mined block has gained consensus, changing the block value at a single node would create an obvious auditable difference.  One issue with assessing this type of attack is that it falls outside of the Blockchain validation protocol.  To what extent might an attacker be able to gain control of enough network nodes to replace blocks with spurious preimages?  The Blockchain protocol has no process by which individual nodes would cross-validate their block values against the peer network, nor is there any defined network-wide arbitration method or auditing approach that would uncover or rationalize a distributed post-consensus preimage attack. In fact, traditional approaches to such validation use centralization or third parties; both are contrary to the intent of Blockchain technology.  The Government Accountability Office has noted the lack of security and auditing protocols outside of the organic Blockchain processes as a risk to government implementations and intends to study the associated implications as part of its five-year plan [20].

Another claim made in the paper is that quantum computers utilizing Grover's algorithm may be used to find Proof of Work nonces for Blockchain expansion at a quadratically faster rate than nodes employing traditional computers, allowing them to seize control of the growth of the chain, including the opportunity to insert spurious blocks/transactions.

While this is a clear application for quantum computation against the Blockchain, it hinges on the idea that adoption of quantum computing technologies would be asymmetric, or that the advent of such capabilities could be kept secret. Our group found it highly unlikely that a "quantum-leap" (pun intended) in capability could be kept secret, considering the collaborative efforts surrounding academic and industry quantum computing capabilities that have been observed so far.  Regardless, many Blockchain ledgers are adopting consensus-creation methodologies that are resistant to this form of attack.  Consensus creation requires overcoming the Byzantine generals problem [22].  The classic Blockchain option suggested by Nakamoto [1] uses Proof of Work, in implementations that are vulnerable to accelerated searching via Grover's algorithm.  An alternative family of consensus approaches is based on Proof of Stake (PoS) and requires distributed nodes to stake fixed wagers, with limited information, on whether a block is valid until consensus is reached [14].  Such a process is not known to be vulnerable to quantum computing attack and is being adopted by ledgers with concerns about quantum vulnerability, like Ethereum [21] and the Quantum Resistant Ledger (QRL) [12].

Finally, the MITRE paper suggests a secondary threat, associated with block security, that is not technically part of the core Blockchain protocols. The paper notes that any public-key encryption or signing of block/transaction contents using commonly employed methods could be broken by Shor's algorithm, exposing the transaction contents.

Our group actually believes this threat is underplayed in the MITRE paper. The paper notes only the use of the Blockchain for cryptographic storage of transaction contents. What isn't mentioned is the use of ECDSA signatures to establish ownership of, for example, Bitcoin cryptocurrency wallets themselves. If Shor's algorithm can be used to break these signatures, then control of the wallets could be taken.  All that is required for Shor's algorithm is knowledge of the user's public key. According to Waterland [11], best recommended practice for Bitcoin is to use one-time signatures, yet as of November 2016, 49.6% of Bitcoin balances were held by accounts with exposed ECDSA public keys.

The QRL [11] proposes the use of a forward-secure, one-time public key signing approach using the eXtended Merkle Signature Scheme (XMSS).  This approach can rapidly generate large numbers of keys of sufficient length (256 bits) to make them resistant to attack by Grover's algorithm.
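XMSS itself is involved, but the underlying idea of hash-based one-time signatures can be conveyed with the simpler Lamport scheme, a conceptual ancestor of the one-time keys that XMSS organizes under a single Merkle-tree public key. The sketch below is Lamport's construction, not XMSS itself; security rests only on the hash, so it is not threatened by Shor's algorithm:

```python
import hashlib
import secrets

def H(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def lamport_keygen():
    # Two random 256-bit secrets per message bit; the public key is their hashes.
    sk = [[secrets.token_bytes(32), secrets.token_bytes(32)] for _ in range(256)]
    pk = [[H(a), H(b)] for a, b in sk]
    return sk, pk

def msg_bits(msg: bytes) -> list:
    digest = H(msg)
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(msg: bytes, sk) -> list:
    # Reveal exactly one secret per message-digest bit; hence strictly one-time.
    return [sk[i][bit] for i, bit in enumerate(msg_bits(msg))]

def verify(msg: bytes, sig: list, pk) -> bool:
    return all(H(sig[i]) == pk[i][bit] for i, bit in enumerate(msg_bits(msg)))

sk, pk = lamport_keygen()
sig = sign(b"transfer 1 BTC", sk)
assert verify(b"transfer 1 BTC", sig, pk)
assert not verify(b"transfer 2 BTC", sig, pk)   # the key must never be reused
```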

7. Conclusion

We considered the security threats to Blockchain technology that might be introduced by quantum computing capability.  Current approaches to the information security of individual Blockchain processes and components rely on maintaining sufficient complexity to stay ahead of Moore's-law growth in brute-force attacks by traditional computer architectures, while relying on the likelihood that the underlying hard problems on which the security is based will remain intractable.  Quantum algorithms have been shown to challenge the computational complexity assumptions of key component security protocols, including standard public key cryptography/signing algorithms and secure hash key lengths.  But the distributed nature of the Blockchain gives it greater resiliency to quantum attack on the overall ledger when compared to the threats to individual processes or transactions that rely directly on the protocols vulnerable to quantum attack.

8. References

[1] Nakamoto, S. (2008). Bitcoin: A peer-to-peer electronic cash system.

[2] Moore, G. E. (1998). Cramming more components onto integrated circuits. Proceedings of the IEEE, 86(1), 82-85.

[3] Roell, J. (2018). The Need, Promise, and Reality of Quantum Computing. [online] Towards Data Science. Available at: https://towardsdatascience.com/the-need-promise-and-reality-of-quantum-computing-4264ce15c6c0 [Accessed 3 May 2018].

[4] Nielsen, M., and Chuang, I. (2010). Quantum Computation and Quantum Information. 10th Anniversary Edition. Cambridge University Press.

[5] Shor, P.W. (1999). Polynomial-time algorithms for prime factorization and discrete logarithms on a quantum computer. SIAM Review, 41(2), pp. 303-332. arXiv:quant-ph/9508027.

[6] Dong, C. (2018). Euler's Totient Function and Euler's Theorem. [online] Doc.ic.ac.uk. Available at: https://www.doc.ic.ac.uk/~mrh/330tutor/ch05s02.html [Accessed 3 May 2018].

[7] Chu, J. (2018). The beginning of the end for encryption schemes?. [online] MIT News. Available at: http://news.mit.edu/2016/quantum-computer-end-encryption-schemes-0303 [Accessed 3 May 2018].

[8] Brassard, G., Høyer, P. and Tapp, A. (1997). Quantum algorithm for the collision problem. arXiv preprint quant-ph/9705002.

[9] Wolchover, N. (2018). Quantum-Secure Cryptography Crosses Red Line | Quanta Magazine. [online] Quanta Magazine. Available at: https://www.quantamagazine.org/quantum-secure-cryptography-crosses-red-line-20150908 [Accessed 3 May 2018].

[10] Regev, O. (2009) On lattices, learning with errors, random linear codes, and cryptography. Journal of the ACM. 56(6).

[11] Waterland, P. (2016). Quantum Resistant Ledger. [online] Available at: https://github.com/theQRL/Whitepaper/blob/master/QRL_whitepaper.pdf [Accessed 3 May 2018].

[12] info@theqrl.org. (2017). QRL Proof of Stake (PoS) Algorithm. [online] Available at: https://github.com/theQRL/PoS/blob/master/pos-qrl.pdf [Accessed 3 May 2018].

[13] Rodenburg, B., and Pappas, S. (2017). Blockchain and Quantum Computing. MITRE Technical Report, Project No.: 25SPI050-12.  [online] Available at: https://www.mitre.org/publications/technical-papers/blockchain-and-quantum-computing [Accessed 3 May 2018].

[14] Bashir, I. (2017). Mastering the Blockchain. Packt.

[15] Massimo, P. (2017). What Is the Blockchain? Computing in Science & Engineering. 19(5). pp. 92 - 95.

[16] Stallings, W. Cryptography and Network Security, 7th ed. Pearson.

[17] Blockchain.info. (2018). Bitcoin currency statistics. [online] Available at: https://blockchain.info/stats [Accessed 3 May 2018].

[18] Hoffstein J., et al. (1998). NTRU: A ring-based public key cryptosystem. Lecture Notes in Computer Science. 1423. Springer.

[19] Campbell, P., et al. (2014). Soliloquy: a Cautionary Tale. ETSI 2nd Quantum-Safe Crypto Workshop. 6-7 Oct, Ottawa, Canada. [online] Available at: https://docbox.etsi.org/Workshop/2014/201410_CRYPTO/S07_Systems_and_Attacks/S07_Groves_Annex.pdf [Accessed 3 May 2018].

[20] Government Accountability Office (GAO). (2018). Trends Affecting Government and Society. GAO Strategic Plan, 2018-2023. [online] Available at: https://www.gao.gov/assets/700/690262.pdf [Accessed 3 May 2018].

[21] Rosic, A. (2017). What is Ethereum Metropolis: The Ultimate Guide. Blockgeeks Guide. [online] Available at: https://blockgeeks.com/guides/ethereum-metropolis/ [Accessed 3 May 2018].

[22] Lamport, et al. (1982). The Byzantine Generals Problem. ACM Transactions on Programming Languages and Systems. 4(3). pp. 382-401.
