I am pleased to announce that my quantum simulator Qubiter (available on GitHub, BSD license) now has a native TensorFlow backend simulator (see its class `SEO_simulator_tf`; the `tf` stands for TensorFlow). This complements Qubiter's original numpy simulator (contained in its class `SEO_simulator`). A small step for Mankind, a giant leap for me! Hip Hip Hurray!
This means that Qubiter can now calculate the evolution of a state vector using CPU, GPU or TPU. Plus it can do back-propagation on a quantum circuit. Here is a Jupyter notebook that I wrote that uses Qubiter's TF backend to do VQE (Variational Quantum Eigensolving). (I like to call VQE "mean Hamiltonian minimization".)
https://github.com/artiste-qb-net/qubiter/blob/master/jupyter-noteb…
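To unpack what "mean Hamiltonian minimization" means, here is a tiny self-contained toy sketch in plain TF 2.x (my own illustration, not code from the notebook above): the 1-qubit state RY(theta)|0> has mean value of the Hamiltonian H = Z equal to cos(theta), and gradient descent drives theta toward pi, where that mean is smallest.

```python
import tensorflow as tf

theta = tf.Variable(0.3)  # circuit parameter, the "weight" to train
opt = tf.keras.optimizers.SGD(learning_rate=0.2)

for step in range(100):
    with tf.GradientTape() as tape:
        # amplitudes of |psi(theta)> = RY(theta)|0>
        amp0 = tf.cos(theta / 2)
        amp1 = tf.sin(theta / 2)
        # mean value of H = Z is |amp0|^2 - |amp1|^2 = cos(theta)
        mean_h = amp0 ** 2 - amp1 ** 2
    grads = tape.gradient(mean_h, [theta])
    opt.apply_gradients(zip(grads, [theta]))

print(theta.numpy())  # ~ pi, where <Z> reaches its minimum of -1
```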
Numpy is a tensor library in Python; TensorFlow (produced by Google) is another. TensorFlow matches, almost one to one in name and functionality, every function in numpy. But the TF namesake functions are much more powerful than their numpy counterparts. Besides the usual CPU, they can access distributed computing resources like GPUs and TPUs. Furthermore, they can be asked to "tape" a history of their use, and then to replay that history in reverse, back-propagation mode, so as to calculate derivatives of a list of user-designated variables. These derivatives can then be used to minimize a cost (aka loss or objective) function. Such minimizations are the bread and butter of classical and quantum neural nets.
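Here is a small sketch of both points, the namesake functions and the taping (everything used below, `tf.reshape`, `tf.matmul`, `tf.GradientTape`, is standard TF 2.x API):

```python
import numpy as np
import tensorflow as tf

# Many np functions have tf namesakes with near-identical signatures:
a_np = np.reshape(np.arange(6.0), (2, 3))
a_tf = tf.reshape(tf.range(6.0), (2, 3))
np.matmul(a_np, a_np.T)
tf.matmul(a_tf, a_tf, transpose_b=True)

# Unlike numpy, TF can "tape" a computation and replay it backwards
# to get derivatives of user-designated variables:
x = tf.Variable(2.0)
with tf.GradientTape() as tape:
    y = x ** 3  # taped forward pass
print(tape.gradient(y, x))  # dy/dx = 3x^2 = 12.0
```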
These are exciting times for TF:
- Just last March, Google officially released TF version 2.0.
- In TF 2.0, Eager mode (the mode that Qubiter's TF backend uses) has been elevated to the default. TF Eager uses dynamic graphs, just like TF's competitor PyTorch (produced by Facebook); see the snippet after this list.
- TF 2.0 also incorporates "Edward", a software lib by Dustin Tran for doing calculations with Bayesian Networks. Edward in its TF incarnation is called TF Probability. TF 2.0 also incorporates Keras (a library for building layered neural nets), and PyMC (a lib for doing Markov Chain Monte Carlo calculations) is being rebuilt on top of TF Probability.
- Anyone can run TF 2.0 on the cloud via Google’s Colab. That is, in fact, how I developed and tested Qubiter’s TF backend.
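For instance, here is what eager mode looks like in practice (a minimal TF 2.x snippet of my own):

```python
import tensorflow as tf

# In TF 2.0, eager execution is on by default: ops run immediately
# and return concrete values, instead of building a static graph
# that must be run later inside a Session (the TF 1.x style).
x = tf.constant([[1.0, 2.0], [3.0, 4.0]])
print(tf.matmul(x, x))  # prints the concrete 2x2 result right away
print(tf.executing_eagerly())  # True
```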
In theory, all you have to do to convert a numpy simulator to a tensorflow one is to replace every function with the prefix `np.` by its namesake with the prefix `tf.`. However, the devil is in the details. I, like most good programmers, hate repeating code. I try to avoid doing so at all costs, because I don't want to have to correct the same bug in multiple places. So I built class `SEO_simulator_tf` by sub-classing the original numpy simulator class `SEO_simulator`. That is one of the beauties of sub-classing in OOP (Object Oriented Programming), of which I am a great fan: it helps you avoid repeating a lot of code. So `SEO_simulator_tf` ended up being a deceptively short and simple class: all it does, basically, is turn on a bunch of flags that it passes to its parent class `SEO_simulator`. Most of the new code is in the parent class `SEO_simulator` and in other classes, like `OneBitGates`, which have been modified to respond to all the flags that `SEO_simulator_tf` sets.
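To make the design concrete, here is a hypothetical sketch of that pattern; the flag `use_tf` and the method `evolve` are my own inventions for illustration, not Qubiter's actual API:

```python
import numpy as np
import tensorflow as tf

class SEO_simulator:
    def __init__(self, use_tf=False):
        # one flag decides which tensor library all ops dispatch to
        self.lib = tf if use_tf else np

    def evolve(self, gate_mat, state_vec):
        # the same line serves both backends, because tf.tensordot
        # is a namesake of np.tensordot with the same signature
        return self.lib.tensordot(gate_mat, state_vec, axes=1)

class SEO_simulator_tf(SEO_simulator):
    def __init__(self):
        # the subclass is deceptively short: it just flips the flag
        # and lets the parent class do all the real work
        super().__init__(use_tf=True)

# usage: apply a Hadamard gate to |0>
had = tf.constant([[1.0, 1.0], [1.0, -1.0]]) / tf.sqrt(2.0)
ket0 = tf.constant([1.0, 0.0])
print(SEO_simulator_tf().evolve(had, ket0))  # ~ [0.707, 0.707]
```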