Commit 52e5b82

Jonathan Moussa authored and committed
new blog post after hiatus
1 parent 0201a29 commit 52e5b82

8 files changed: +176 additions, −15 deletions

Gemfile

Lines changed: 1 addition & 1 deletion
@@ -23,7 +23,7 @@ group :jekyll_plugins do
 end
 
 # To deal with security vulnerability:
-gem "nokogiri", ">= 1.10.4"
+gem "nokogiri", ">= 1.10.8"
 
 # Windows does not include zoneinfo files, so bundle the tzinfo-data gem
 gem "tzinfo-data", platforms: [:mingw, :mswin, :x64_mingw, :jruby]
Lines changed: 143 additions & 0 deletions
@@ -0,0 +1,143 @@
---
layout: post
title: "Anti-quantum algorithms"
categories: research
---

It has been about 6 months since my last blog post.
I have been busy since then with various projects, both associated with my job and my personal research program,
but I haven't had any bloggable results until now.
I had intended to start blogging more on thoughts and opinions that aren't connected to research results,
but my thoughts got caught up in a slow-burning existential career crisis that I didn't really want to blog about until they had converged.
I will be discussing these thoughts in my next blog post in the near future.

I'd also like to briefly compare what I've actually been up to with the plans that I articulated in my last blog post.
I noted a queue of 4 planned papers.
The first paper was an electronic structure application of the Cauchy kernel paper that was the subject of my last post.
I ended up putting that on hold,
but I will now be finishing it up for the SIAM Annual Meeting in July (I was invited to present a talk, and hopefully the conference won't be cancelled).
The second paper was a final revision of the [quantum Metropolis algorithm paper](https://arxiv.org/abs/1903.01451) that I presented at the 2019 APS March Meeting.
I heavily revised the paper and improved it substantially,
but I uncovered an important technical problem at the end of the revision process
(the repeat-until-success loop of the algorithm has a fat tail of success probabilities that can cause a divergence in average run times)
that will necessitate yet another major revision later this year.
The third paper was a very old project on designing molecules with extremely low ionization potentials,
which I resumed technical work on but had to pause as the 2020 APS March Meeting grew closer.
The fourth paper is the subject of this blog post.
As suggested by this mismatch between intent and outcomes,
scientific research is often fraught with large uncertainties and frequent delays.
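The fat-tail problem mentioned above is easy to see in a toy model (my own illustrative sketch, not the actual analysis in the paper): a repeat-until-success loop with per-instance success probability p needs 1/p attempts on average, so any distribution of instances with appreciable weight near p = 0 has a divergent mean run time, even though every individual instance halts with probability one.

```python
import math

# Toy model: success probabilities p are spread roughly uniformly over
# [eps, 1].  Each instance needs 1/p attempts on average, so the mean over
# instances is
#     (1 / (1 - eps)) * integral_{eps}^{1} (1/p) dp = ln(1/eps) / (1 - eps),
# which diverges as the cutoff eps -> 0: the average run time is unbounded
# even though any single instance terminates with probability one.
def mean_attempts(eps):
    return math.log(1.0 / eps) / (1.0 - eps)

for eps in [1e-2, 1e-4, 1e-8, 1e-16]:
    print(f"p cutoff {eps:.0e}: mean attempts = {mean_attempts(eps):.1f}")
```

The names and the uniform-density assumption here are purely hypothetical; the point is only that a fat tail near p = 0 makes the average, not the typical, run time blow up.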
This post is mainly about my latest preprint, ["Robust decompositions of quantum states"](https://arxiv.org/abs/2003.04171),
which I was planning to speak about at the 2020 APS March Meeting before it was cancelled because of the global coronavirus outbreak.
I'm trying to make the best of this situation, and I've instead [recorded my talk on YouTube](https://youtu.be/sUka7hj5e_E).
Perhaps it will eventually reach a wider audience than it would have at the conference.
Indirectly, this project has an exceptionally long history.
I started trying to develop new computational methods for simulating quantum many-body systems before I even started graduate school in 2002.
As with many of my research passions, I had a lot of enthusiasm, lofty goals, and high standards, which resulted in a steady stream of failed ideas for years on end.
I was particularly obsessed with trying to solve the 2D Hubbard model,
which was and still is an important, unsolved model of strong electron correlations and possibly electron-mediated superconductivity.
My modus operandi at the time was to focus on whatever methodological idea seemed promising at the moment,
mess around with it and stress-test it in some way until it inevitably broke,
and then brainstorm a new idea to focus on that was informed by the past failures.
At times, the churn of this approach became super high, and ideas sometimes wouldn't last more than a day.
Those were fun times, but I wasn't able to produce papers from this process, which I eventually acknowledged to be a serious career problem.
As a postdoc, I decided to commit to one last, best idea for treating strong quantum correlations, which only produced [an incomplete preprint](https://arxiv.org/abs/1003.2596)
after the preliminary numerical results that I generated turned out to be absolutely terrible.
After that, I stopped working on strong-correlation methods and started working on more modest weak-correlation methods based on the random-phase approximation (RPA).
My RPA work was a lot more successful, and I ended up with [a published paper](https://doi.org/10.1063/1.4855255) that I am still quite proud of.
Unfortunately, I had to stop working on RPA methods when my funding for electronic structure research at Sandia National Labs suddenly dried up,
and I was pushed into working on quantum computing research instead.

Working on quantum computing research at Sandia National Labs was a frustrating experience.
I was very interested in developing simulation methods for quantum systems,
but funding for electronic structure research in very pragmatic and heavily applied environments like Sandia and the other DOE weapons labs (Livermore and Los Alamos)
has been on a long, slow-and-steady decline because such methods just don't have enough practical value.
One of the big rationales for quantum computing, both at Sandia and more broadly,
is to improve our ability to simulate quantum systems,
but it is absolutely bizarre, counterintuitive, and counterproductive that we would be increasing our investment in developing
quantum algorithms for quantum computers that we don't have yet while steadily decreasing our investment in developing classical algorithms
for classical computers that we have plenty of and are arguably using very inefficiently.
As a trendy research topic, quantum computing gets the benefit of the doubt, while the older and no-longer-trendy topic of electronic structure no longer does.
It is especially bizarre given that researchers generally agree that
(1) a lot of methodology is shared, so that improvements to classical algorithms will have residual benefits for quantum algorithms,
(2) quantum computers will likely supplement or accelerate simulations that mostly occur on classical computers, and
(3) the benefits of quantum computers for quantum simulation are asymptotic in accuracy and system size,
so they might not have any benefits at all for the accuracies and system sizes that people normally deal with.
While I learned about quantum algorithm development and made numerous attempts at developing new quantum algorithms at Sandia,
I also tried my best to inject classical algorithm development wherever I could get away with it.
This was also an excuse to return to research in strong-correlation methods, now considering a broader set of tools and ingredients,
including classical algorithms, both deterministic and stochastic, and quantum algorithms.

I did not have a clear agenda or program of research when I started working in quantum computing.
At first, I spent a lot of time just learning the subject and catching up with several decades of research.
Eventually, after a period of more exploratory efforts, I settled into a program of quantum computing research
that I have now carried with me beyond my time at Sandia.
The premise is that someday, once we finally have large digital quantum computers in a few decades or so,
we will carry out atomistic simulations containing both classical and quantum degrees of freedom
that are distributed over classical and quantum computers.
I am interested in delineating these classical/quantum boundaries in both simulation and computation:
(1) what degrees of freedom should be treated classically, semi-classically, or fully quantum mechanically, and
(2) how should we partition the simulation between classical and quantum computers, and how should they be coupled together?
My approach to resolving these boundaries right now is a two-pronged attack,
aimed at developing better classical algorithms to push on the classical side of the boundary
and better quantum algorithms to push on the quantum side of the boundary,
so that the boundary is resolved evenly from both sides.

The quantum side of this research program is much farther along, as it was developed first.
For now, it mainly consists of two parts: quantum error correction (QEC) algorithms and quantum Metropolis algorithms.
The overall cost of quantum computing depends on multiple layers: the underlying physical qubit technology,
the QEC layer, and the algorithms layer.
For reasons of long-term career stability, I am very loath to commit to a specific physical qubit technology,
so I've instead focused on the other two layers.
I wrote several papers on QEC at Sandia, and I find the topic to be very interesting, but it just isn't a very popular research topic right now.
While I have a lot of ideas for future projects, my QEC research has been on hold for several years now,
and I don't plan to return to it until a logical qubit has been experimentally realized.
My interest in QEC is now more about patenting essential components of future quantum computers than purely academic curiosity,
and there is no point in starting a patent clock in a world without logical qubits.
Instead, I've been working on quantum algorithms for preparing thermal states.
This started at Sandia, where I tried to develop a quantum analog of the Langevin thermostat.
The development of a quantum Langevin thermostat turned into a horrible technical slog (which I might eventually recount in this blog),
but it eventually evolved into the development of a quantum Metropolis algorithm, on which I released a [preprint](https://arxiv.org/abs/1903.01451)
shortly before starting this blog.
The quantum Metropolis algorithm is very promising but still has a few outstanding technical problems,
and I expect to finish it and release a final version of the paper later this summer.
The classical side of this research program also started at Sandia, but it was much slower to develop.
The basic premise of the project was that noise and uncertainty in quantum systems must inevitably drive them to some kind of classical limit
that is described by classical physics and is efficient to simulate on classical computers.
As quantum systems are driven towards that limit, they should become easier to simulate on classical computers even while retaining much of their quantum identity.
There are some very nice results in quantum information theory that embody this concept very clearly.
Perhaps the best example is quantum stabilizer circuits (Clifford gates with qubit preparation and measurement operations)
that include T gates (a non-Clifford gate that enables universal quantum computation) subject to depolarizing noise.
When the noise in the T gates is smaller than a known threshold,
the noise can be "distilled" away to a negligible amount using a process known as magic state distillation,
which is an important ingredient in many plans for digital quantum computers.
As the threshold is approached, the efficiency of magic state distillation drops to zero (a diverging number of noisy T gates is needed to distill out a low-noise T gate).
When the noise is at or above the threshold, these circuits can be efficiently simulated on a classical computer
and are no longer capable of universal quantum computation.
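This threshold behavior is easy to sketch numerically. As a rough illustration of my own (the exact threshold value depends on the noise model and protocol, so treat the numbers as schematic), the standard 15-to-1 distillation protocol of Bravyi and Kitaev maps an input T-state error rate ε to about 35ε³ per round at leading order, giving an unstable fixed point at ε = 35^(−1/2) ≈ 0.17: below it, iteration drives the error toward zero; above it, distillation fails.

```python
# Leading-order error map of the 15-to-1 magic state distillation protocol:
# each round maps an input T-state error rate eps to about 35 * eps**3.
# The nontrivial fixed point eps = 35**-0.5 ~= 0.169 acts as the threshold
# in this idealized picture.
def distill(eps, rounds=5):
    for _ in range(rounds):
        # The cubic formula is only a small-eps approximation; cap at 1 so
        # the failing branch stays a sensible probability.
        eps = min(35.0 * eps**3, 1.0)
    return eps

for eps0 in [0.05, 0.10, 0.16, 0.18]:
    final = distill(eps0)
    verdict = "distills" if final < eps0 else "fails"
    print(f"initial error {eps0:.2f} -> {final:.3e} after 5 rounds ({verdict})")
```

Starting at 0.16 still converges (slowly at first), while 0.18 runs away, which is the "efficiency drops to zero near the threshold" behavior described above.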
While this is a very compelling result, it has very little to do with physical quantum devices or any applications other than QEC.
With a lot of popular interest in using noisy quantum devices for quantum simulation tasks without any QEC,
I wanted to develop a comparable result that was relevant to simulation (i.e. showing that noise makes quantum simulation tasks easier for classical computers).
For a long time, this project was too hung up on trying to model specific instances of analog quantum simulators,
but realistic noise is messy and does not directly make systems easier to simulate.
In fact, a lot of experimental noise is actually very difficult to model and simulate,
and it makes some noisy quantum systems even harder to simulate than their idealized versions.
I finally started to make progress when I gave up on realistic physical noise
and decided to design noise that was tailored to making quantum systems easier to simulate.
This noise was based on qubit measurements, which introduced a stochastic component that effectively made it a type of quantum Monte Carlo method.
I had always discounted stochastic algorithms in my previous attempts at developing strong-correlation solvers
because of the inherently high cost of reducing sampling errors,
but here it was natural and inevitable in mimicking the capabilities of quantum computing hardware.
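That inherent cost is just the usual Monte Carlo scaling: the statistical error of a sample mean falls off only as 1/√N, so each extra decimal digit of accuracy costs roughly 100 times more samples. A minimal self-contained illustration (my own toy example, not code from the paper):

```python
import random

random.seed(2020)

# RMS error of a Monte Carlo estimate of a known mean (E[U] = 0.5 for U
# uniform on [0, 1]), measured over many independent trials.  Increasing
# the sample count N by 100x shrinks the error by only about 10x.
def rms_error(n_samples, n_trials=200):
    sq = 0.0
    for _ in range(n_trials):
        est = sum(random.random() for _ in range(n_samples)) / n_samples
        sq += (est - 0.5) ** 2
    return (sq / n_trials) ** 0.5

for n in [100, 10_000]:
    print(f"N = {n:>6}: RMS error ~= {rms_error(n):.4f}")
```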
In a way, the development of classical algorithms that simulate quantum systems by introducing noise that saps them of their inherent quantum computing power
is kind of like developing "anti-quantum" algorithms, at least for the purposes of a catchy title for a blog post.
As I will discuss more in my next blog post, I now feel a strong need to consolidate my research into a more narrowly focused technical path.
My end goal on the timescale of several years is still to develop a new generation of semiempirical models,
but that research area is extremely unpopular now (making it impossible to get any sort of support for it or career benefit from it)
and in need of a major technical overhaul.
Right now, I believe that further developing the results of this project, both the theory and the software implementation,
will both help with the technical overhaul of semiempirical models (by distinguishing between Hamiltonian model errors and correlation model errors)
and be a direct beneficiary of it (by gaining access to simple model Hamiltonians that are quantitative representations of real molecules and materials).
I'm trying to balance my research program so that it aligns as much as possible with what I want to work on and what I believe to be the best science that I am capable of doing,
while also maintaining connections to popular research topics that give me a better chance of eventually finding financial support to sustain a research career
that is presently in a slow but inevitable decline.

assets/2019-08-06-MNDO-He-dimer.ipynb

Lines changed: 12 additions & 0 deletions
@@ -654,6 +654,18 @@
 "display_name": "Python 3",
 "language": "python",
 "name": "python3"
+},
+"language_info": {
+"codemirror_mode": {
+"name": "ipython",
+"version": 3
+},
+"file_extension": ".py",
+"mimetype": "text/x-python",
+"name": "python",
+"nbconvert_exporter": "python",
+"pygments_lexer": "ipython3",
+"version": "3.7.4"
 }
 },
 "nbformat": 4,

assets/CV.aux

Lines changed: 1 addition & 1 deletion
@@ -25,7 +25,7 @@
 \@writefile{toc}{\contentsline {section}{Publications}{2}{section*.7}}
 \gdef\etaremune@i{23}
 \@writefile{toc}{\contentsline {section}{Unpublished Work}{3}{section*.8}}
-\gdef\etaremune@ii{10}
+\gdef\etaremune@ii{12}
 \gdef\etaremune@iii{1}
 \gdef\etaremune@iv{5}
 \@writefile{toc}{\contentsline {section}{Awarded Grants}{4}{section*.9}}

assets/CV.log

Lines changed: 11 additions & 11 deletions
@@ -1,4 +1,4 @@
-This is pdfTeX, Version 3.14159265-2.6-1.40.19 (TeX Live 2018) (preloaded format=pdflatex 2018.4.16) 31 JUL 2019 11:02
+This is pdfTeX, Version 3.14159265-2.6-1.40.19 (TeX Live 2018) (preloaded format=pdflatex 2018.4.16) 5 OCT 2019 19:06
 entering extended mode
 restricted \write18 enabled.
 file:line:error style messages enabled.
@@ -372,27 +372,27 @@ Overfull \hbox (5.19522pt too wide) in paragraph at lines 395--398
 []
 
 
-Overfull \hbox (18.81572pt too wide) in paragraph at lines 411--413
+Overfull \hbox (18.81572pt too wide) in paragraph at lines 417--419
 []\OT1/cmr/bx/n/10 Moussa, J. E.\OT1/cmr/m/n/10 . Measurement-Based Quan-tum Me
 tropo-lis Al-go-rithm. [][]arXiv:1903.01451[][]
 []
 
 [3] [4]
 AED: lastpage setting LastPage
 [5]
-Package atveryend Info: Empty hook `BeforeClearDocument' on input line 543.
-Package atveryend Info: Empty hook `AfterLastShipout' on input line 543.
+Package atveryend Info: Empty hook `BeforeClearDocument' on input line 549.
+Package atveryend Info: Empty hook `AfterLastShipout' on input line 549.
 (./CV.aux)
-Package atveryend Info: Executing hook `AtVeryEndDocument' on input line 543.
-Package atveryend Info: Executing hook `AtEndAfterFileList' on input line 543.
+Package atveryend Info: Executing hook `AtVeryEndDocument' on input line 549.
+Package atveryend Info: Executing hook `AtEndAfterFileList' on input line 549.
 Package rerunfilecheck Info: File `CV.out' has not changed.
 (rerunfilecheck) Checksum: EA64E58B17EAB629346B2DA401AA005D;810.
-Package atveryend Info: Empty hook `AtVeryVeryEnd' on input line 543.
+Package atveryend Info: Empty hook `AtVeryVeryEnd' on input line 549.
 )
 Here is how much of TeX's memory you used:
 5851 strings out of 492649
 86837 string characters out of 6129622
-190462 words of memory out of 5000000
+189462 words of memory out of 5000000
 9683 multiletter control sequences out of 15000+600000
 5200 words of font info for 19 fonts, out of 8000000 for 9000
 1141 hyphenation exceptions out of 8191
@@ -407,10 +407,10 @@ sr/local/texlive/2018/texmf-dist/fonts/type1/public/amsfonts/cm/cmr10.pfb></usr
 cal/texlive/2018/texmf-dist/fonts/type1/public/amsfonts/cm/cmsy10.pfb></usr/loc
 al/texlive/2018/texmf-dist/fonts/type1/public/amsfonts/cm/cmsy7.pfb></usr/local
 /texlive/2018/texmf-dist/fonts/type1/public/amsfonts/cm/cmti10.pfb>
-Output written on CV.pdf (5 pages, 141002 bytes).
+Output written on CV.pdf (5 pages, 141279 bytes).
 PDF statistics:
-193 PDF objects out of 1000 (max. 8388607)
-174 compressed objects within 2 object streams
+195 PDF objects out of 1000 (max. 8388607)
+176 compressed objects within 2 object streams
 22 named destinations out of 1000 (max. 500000)
 129 words of extra memory for PDF output out of 10000 (max. 10000000)

assets/CV.pdf

277 Bytes
Binary file not shown.

assets/CV.synctex.gz

772 Bytes
Binary file not shown.

assets/CV.tex

Lines changed: 8 additions & 2 deletions
@@ -408,16 +408,22 @@ \section{Publications} \begin{bibsection}
 
 \section{Unpublished Work} \begin{bibsection}
 
+\item \textbf{Moussa, J. E.}. Minimax separation of the Cauchy kernel.
+\href{https://arxiv.org/abs/1909.06911}{arXiv:1909.06911} (2019).
+
+\item Metcalf, M., \textbf{J. E. Moussa}, W. A. de Jong, M. Sarovar. Engineered thermalization of quantum many-body systems.
+\href{https://arxiv.org/abs/1909.02023}{arXiv:1909.02023} (2019).
+
 \item \textbf{Moussa, J. E.}. Measurement-Based Quantum Metropolis Algorithm.
 \href{https://arxiv.org/abs/1903.01451}{arXiv:1903.01451} (2019).
 
-\item L. Shulenburger, A. D. Baczewski, S. M. Foiles, A. E. Wills, N. A. Modine, \textbf{J. E. Moussa}, P. A. Schultz, V. Tikare, and A. F. Wright.
+\item Shulenburger, L., A. D. Baczewski, S. M. Foiles, A. E. Wills, N. A. Modine, \textbf{J. E. Moussa}, P. A. Schultz, V. Tikare, and A. F. Wright.
 Next-Generation Electronic Structure Codes. Sandia Technical Report SAND2016-9782 (2016).
 
 \item \textbf{Moussa, J. E.}. Linear embedding of free energy minimization.
 \href{https://arxiv.org/abs/1603.05180}{arXiv:1603.05180} (2016).
 
-\item Metodi, T. S. , A. J. Landahl, C. Ryan-Anderson, M. S. Carroll, \textbf{J. E. Moussa}, and R. P. Muller.
+\item Metodi, T. S., A. J. Landahl, C. Ryan-Anderson, M. S. Carroll, \textbf{J. E. Moussa}, and R. P. Muller.
 SEQIS Late Start LDRD: Final Report - Robust Quantum Operations. Sandia Technical Report SAND2015-10754 (2015).
 
 \item \textbf{Moussa, J. E.} and A. D. Baczewski. Comment on ``Self-Averaging Stochastic Kohn-Sham Density-Functional Theory''.

0 commit comments
