Entering The Next Era of Drug Discovery, Thanks to VQEs.

A Walkthrough of VQEs, Their Impact and an Annotated Copy of My Code

Priyal Taneja
9 min read · Feb 10


“Why is there no cure yet?”

That’s always my first thought whenever I come across a heart-wrenching story of someone facing cancer, dementia, or any other debilitating illness without a cure. After all, these diseases have been the focus of attention for a while now. They’ve had billions of dollars and countless hours invested in the pursuit of a cure.

When I tried to find an answer to my question, I came across an alarming figure: it takes a minimum of 10 years and roughly $2.6 billion to research and develop a successful drug.

Developing drugs is all about small molecules and macromolecules, and the interactions between them. Of course, the goal is to find interactions that don't trigger a negative reaction in our bodies.

Simulating molecules is an important part of the whole drug development process, as it can help predict the behaviour and properties of potential drugs prior to being synthesized and tested in labs — which saves on a whole load of time and resources. It can also help us determine certain physical and chemical properties, which is key because we’re able to easily determine the best formulation and dosing regimen for the drug.


…Yes, there's a but.

Simulation is a difficult problem for our conventional computers. If you had one atom, you'd have to keep track of every single proton, neutron, and electron… which is doable. Except the amount of information needed grows exponentially as atoms are added, and it quickly becomes far more data than we, as humans, or our computers can handle. And inaccurate simulations that take forever to run aren't really helpful.

Quantum computers, however, are up to the task. They can simulate molecular interactions much more efficiently than classical computers. This allows researchers to quickly analyze potential drug candidates and identify the best options for further development, saving billions of dollars and years of time otherwise spent on drugs with minimal potential.

Enter Variational Quantum Eigensolvers…

A crucial part of analyzing a molecule is determining the ground state of its electrons, which can tell us about its stability, reactivity, and binding affinity to target proteins. But, I'm sure you already guessed it: the number of possible configurations of the electrons in a molecule increases exponentially with the size of the molecule, and that's too much data for our classical computers to analyze.
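To get a feel for that exponential growth, here's a tiny sketch in plain Python (an illustration of the scaling argument, not part of the VQE code): describing a system of n two-level components exactly takes 2^n complex amplitudes.

```python
# Storing the exact quantum state of n two-level systems (qubits)
# requires 2**n complex amplitudes. At 16 bytes per complex amplitude,
# the memory needed doubles with every component added.
for n in (10, 30, 50):
    amplitudes = 2 ** n
    gib = amplitudes * 16 / 2 ** 30  # 16 bytes per double-precision complex number
    print(f"n = {n}: {amplitudes} amplitudes (~{gib:,.0f} GiB)")
```

At n = 30 that's already 16 GiB, and at n = 50 it's millions of GiB, which is why exact classical simulation hits a wall so fast.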

But, the key to quantum computers being this powerful a tool for finding ground states is something called a Variational Quantum Eigensolver (VQE). VQEs are hybrid quantum-classical algorithms for finding the lowest eigenvalue of a Hermitian operator, which represents the energy of a quantum system. And molecules are quantum systems, after all; both are described at the atomic scale.

VQEs use the variational principle (from quantum mechanics) to estimate the lowest eigenvalue of the Hamiltonian, which is the ground-state energy of the system.

Stick with me while I break down the variational method equation and explain some of the beautiful math behind VQEs… it’s fun I promise.

First, we need to know that, for a computer, a molecule's Hamiltonian is represented as a matrix. And given that it's a Hermitian matrix, it's equal to its own conjugate transpose. What we're trying to do is find the smallest eigenvalue of this Hermitian matrix.

The main takeaway is that the spectral theorem applies, which guarantees that the eigenvalues of the matrix are real numbers. This Hermitian matrix describes the Hamiltonian of the quantum system, which specifies its total energy.
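As a quick sanity check, here's a toy example in NumPy (the 2×2 matrix is made up for illustration, not a real molecular Hamiltonian) showing both properties: the matrix equals its conjugate transpose, and its eigenvalues come out real.

```python
import numpy as np

# A toy 2x2 Hermitian matrix (hypothetical example)
H = np.array([[1.0, 2.0 - 1.0j],
              [2.0 + 1.0j, -3.0]])

# Hermitian: the matrix equals its own conjugate transpose
assert np.allclose(H, H.conj().T)

# Spectral theorem: the eigenvalues are real
# (eigvalsh is NumPy's eigenvalue routine for Hermitian matrices)
eigenvalues = np.linalg.eigvalsh(H)
print(eigenvalues)  # a real-valued array, no imaginary parts
```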

Mathematically, it's represented by the eigenvalue equation

H |ψᵢ⟩ = λᵢ |ψᵢ⟩

…where λᵢ is the eigenvalue and |ψᵢ⟩ is the corresponding eigenvector of the matrix.

The Hamiltonian can also be written through its spectral decomposition:

H = Σᵢ λᵢ |ψᵢ⟩⟨ψᵢ|

Now that we have two equations involving our Hamiltonian (H), we can combine them to compute the expectation value of H in an arbitrary normalized state |ψ⟩:

⟨ψ|H|ψ⟩ = Σᵢ λᵢ |⟨ψᵢ|ψ⟩|²

This last equation expresses the expectation value as a weighted sum of the eigenvalues. And since each weight |⟨ψᵢ|ψ⟩|² is a squared magnitude, it is greater than or equal to 0, so replacing every λᵢ with the smallest eigenvalue λ₀ can only make the sum smaller:

⟨ψ|H|ψ⟩ ≥ λ₀

This final inequality is the variational principle (or variational method) we were talking about: the expectation value of the Hamiltonian in any wave function is greater than or equal to its lowest eigenvalue, the ground-state energy. It is the fundamental equation behind the workings of VQEs.
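The inequality is easy to check numerically. Here's a minimal sketch in NumPy (using a random 4×4 Hermitian matrix as a stand-in for a molecular Hamiltonian): the expectation value in randomly sampled trial states never dips below the smallest eigenvalue.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a toy Hermitian "Hamiltonian" (for illustration only)
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
H = (A + A.conj().T) / 2

# The true ground-state energy is the smallest eigenvalue
ground_energy = np.linalg.eigvalsh(H).min()

# Sample random normalized trial states |psi> and verify <psi|H|psi| >= E_0
for _ in range(1000):
    psi = rng.normal(size=4) + 1j * rng.normal(size=4)
    psi /= np.linalg.norm(psi)
    expectation = np.real(psi.conj() @ H @ psi)
    assert expectation >= ground_energy - 1e-10

print("All trial-state energies are bounded below by the ground energy.")
```

This is exactly what a VQE exploits: minimize the expectation value over trial states, and you approach the ground-state energy from above.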

Let’s Code a VQE to Find the Ground State of a Triatomic Hydrogen Ion 👩🏻‍💻

Feel free to access my GitHub repository to see the code in its entirety.

Step One: Finding the Molecular Hamiltonian

As usual, we’re going to start off by importing all the required libraries to ensure we have the functions needed to code this project. These are the backbones of our code so make sure all necessary libraries are imported.

# import all required libraries
import pennylane as qml
import matplotlib.pyplot as plt
from pennylane import numpy as np
from pennylane import qchem

In order to find the ground energy level, we need to start from scratch and define the molecule that we're working with. For this particular example, we're going to find the ground state of a triatomic hydrogen ion (H₃⁺), so I've declared each of the symbols along with the geometry of the molecule.

# define the molecule
symbols = ["H", "H", "H"]
coordinates = np.array([[0.0102, 0.0442, 0.0],
                        [0.9867, 1.6303, 0.0],
                        [1.8720, 0.0000, 0.0]])  # third atom: approximate placeholder (original line was truncated)

*The coordinates were obtained from this database

And now, we finally build the Hamiltonian. At the same time, we also find out how many qubits the program needs. For the code below, six qubits are needed, and the printed Hamiltonian is a long sum of Pauli terms (its first coefficient is -0.29975061450095497).

# define the molecular hamiltonian
hamiltonian, qubits = qchem.molecular_hamiltonian(symbols, coordinates,
                                                  charge=1)
print("Number of qubits needed =", qubits)
print("The Hamiltonian is", hamiltonian)

After this, we're going to define an approximate ground state, called the Hartree-Fock state. This is our first guess at the ground state, not the true value.

# determine the hartree-fock state
hf = qchem.hf_state(electrons = 2, orbitals = 6)
print (hf)

The output of the code above, [1 1 0 0 0 0], uses the Jordan–Wigner representation and tells us that there is one spin-up electron and one spin-down electron in the lowest-energy orbital. Now, we'll build the circuit that gives us the expectation value of the Hamiltonian in the Hartree-Fock state.
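For intuition, here's a hypothetical re-implementation of what qchem.hf_state computes (a sketch, not PennyLane's actual code): under the Jordan–Wigner mapping, the Hartree-Fock state simply fills the lowest spin orbitals, one per electron.

```python
import numpy as np

def hf_occupation(electrons, orbitals):
    """Sketch of a Hartree-Fock occupation vector under Jordan-Wigner:
    the lowest `electrons` spin orbitals are occupied (1), the rest empty (0)."""
    occ = np.zeros(orbitals, dtype=int)
    occ[:electrons] = 1
    return occ

print(hf_occupation(2, 6))  # [1 1 0 0 0 0]
```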

# calculate the expectation value
num_wires = qubits
dev = qml.device("default.qubit", wires=num_wires)

@qml.qnode(dev)
def exp_energy(state):
    qml.BasisState(np.array(state), wires=range(num_wires))
    return qml.expval(hamiltonian)


Calling exp_energy(hf) gives -1.24655016 Ha. This isn't the true ground-state energy, but it is an upper bound that we can keep iterating on to push lower.

Step Two: Prepare Trial Ground State

Now, we'll define the circuit that moves pairs of electrons between orbitals through double-excitation gates, which allows our system to reach a second expectation value that is lower than the first.

The first DoubleExcitation gate, controlled by the first element of the parameter list, moves the electron pair from the lowest-energy orbitals to the intermediate ones. The second DoubleExcitation gate acts on params[1] instead and moves the pair from the lowest-energy orbitals to the highest.

# move pairs of electrons between orbitals through double excitation gates
def ansatz(params):
    qml.BasisState(hf, wires=range(num_wires))
    qml.DoubleExcitation(params[0], wires=[0, 1, 2, 3])
    qml.DoubleExcitation(params[1], wires=[0, 1, 4, 5])

Here, we've prepared a parameterized state that serves as our trial ground state.

Step Three: Minimize ⟨H⟩

Remember the fundamental variational principle equation we determined earlier? Here’s when it comes into play.

brought it back as a refresher

This property tells us that if we find a state that minimizes the expectation value of the Hamiltonian, then that state is the ground state. So, now we’re going to move one step closer to our minimum and determine a cost function to lower the current expectation value we have.

# minimize the hamiltonian value using the variational principle
@qml.qnode(dev)
def cost_function(params):
    ansatz(params)
    return qml.expval(hamiltonian)

In this case, we get an energy of -1.26796721 Ha, which is lower than our Hartree-Fock energy… so, yay. We're progressing towards success: we've already found a state with lower energy than the Hartree-Fock approximation.

Finally, we'll optimize our cost function to minimize the expectation value. This is done with gradient descent over the circuit parameters: an iterative process where the energy drops with each step, and every second step is printed. For H₃⁺, 20 iterations are enough, but the number can be smaller or larger depending on the molecule.

# optimize our cost function circuit to find the ground state
opt = qml.GradientDescentOptimizer(stepsize=0.4)
theta = np.array([0.0, 0.0], requires_grad=True)

energy = [cost_function(theta)]
angle = [theta]
max_iterations = 20

for n in range(max_iterations):
    theta, prev_energy = opt.step_and_cost(cost_function, theta)
    energy.append(cost_function(theta))
    angle.append(theta)

    if n % 2 == 0:
        print(f"Step = {n}, Energy = {energy[-1]:.8f} Ha")

# output the ground state and the optimized parameters
print("\n" f"Final ground energy: {energy[-1]: .8f} Ha")
print("\n" f"Final angle parameters: {theta[0]: .8f} {theta[1]: .8f}")
results from the optimization circuit

As we can see, the energy decreases step by step, and step 18 is the last value before the loop starts outputting the same number repeatedly. The converged value of ~-1.27 Ha is lower than the Hartree-Fock energy of ~-1.24 Ha.

And, to end off the determination of our ground state, we’ll create a graph that will visually represent the position of the ground state.

# form a graphical representation with the ground state plotted
fig = plt.figure()

# full configuration interaction energy computed classically
E_fci = -1.27443763

# add energy plot on column 1
ax1 = fig.add_subplot(121)
ax1.plot(range(n + 2), energy, "go", ls="dashed")
ax1.plot(range(n + 2), np.full(n + 2, E_fci), color="red")
ax1.set_xlabel("Optimization step", fontsize=13)
ax1.set_ylabel("Energy (Hartree)", fontsize=13)
ax1.text(0.5, -1.2466, r"$E_\mathrm{HF}$", fontsize=15)
ax1.text(0, -1.2744, r"$E_\mathrm{FCI}$", fontsize=15)

# add angle plot on column 2
ax2 = fig.add_subplot(122)
ax2.plot(range(n + 2), angle, "go", ls="dashed")
ax2.set_xlabel("Optimization step", fontsize=13)
ax2.set_ylabel("Gate parameter $\\theta$ (rad)", fontsize=13)

plt.subplots_adjust(wspace=0.3, bottom=0.2)

Closing Thoughts 💭

We're in a period of time where researchers are finally discovering the impact of quantum computers on drug discovery and research, and VQEs are part of that. They're truly one of the most promising methods for quantum chemistry, as they may offer an advantage over classical algorithms on these problems. Of course, limitations such as noise exist, but the future of quantum computation in chemistry could change the way we approach the drug development process.

Hey, I'm Priyal, a 16-year-old whose ambition revolves around solving complex problems to meaningfully contribute to the world. If you enjoyed my article or learned something new, feel free to subscribe to my monthly newsletter to keep up with my progress in quantum computing exploration and get an insight into everything I've been up to. You can also connect with me on LinkedIn and follow my Medium for more content! Thank you so much for reading.


