Machine Learning with Quantum Computers
I recently watched a Google Tech Talk by Eric Ladizinsky, who visited the Quantum AI Lab at Google to talk about D-Wave's quantum computer. The talk, called Evolving Scalable Quantum Computers, is great; I highly recommend it.
I've had quantum computing on my mind, so when another tech talk titled Quantum Machine Learning came along, I had to jump on it. The talk is by Seth Lloyd of MIT.
The talk starts out with a quick overview of quantum mechanics. He gives a mind-bending example: the roughly 10 billion bits held on a CD could, in principle, be encoded into the quantum state of a single photon. The problem is that the information is then not in a form you can readily manipulate. This problem of developing natural systems that can innately quantum compute is one area that interests him.
The heart of the talk is the description of how to perform classical linear algebra operations on a quantum computer in order to get an exponential speedup (operations that classically scale linearly with the data size become logarithmic). This is desirable because simple (although computationally expensive) vector operations underlie a lot of computer science in general and machine learning algorithms specifically.
The quantum versions of linear algebra operations Seth focuses on are:
- Lloyd's algorithm, which underlies k-means
- Fourier Transform
- Inversion for sparse matrices
- Support Vector Machines (find support vectors, mapping new points)
- Manifold learning (finding holes and connected components)
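To make the first item in the list concrete, here is a minimal classical sketch of Lloyd's algorithm (the iterative heart of k-means) using numpy. This is not from the talk; it is just an illustration of the repeated high-dimensional distance computations over all N points that the quantum variant claims to accelerate.

```python
import numpy as np

def lloyds_kmeans(X, k, n_iters=20, seed=0):
    """Classical Lloyd's algorithm. The O(N * k * d) distance
    computations in each iteration are the vector operations the
    quantum version aims to speed up."""
    rng = np.random.default_rng(seed)
    # Initialise centroids as k distinct points chosen from the data
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iters):
        # Assignment step: nearest centroid for every point (N x k distances)
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: each centroid becomes the mean of its assigned points
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids, labels

# Two well-separated synthetic blobs
rng = np.random.default_rng(42)
X = np.vstack([rng.normal(size=(50, 2)), rng.normal(size=(50, 2)) + 10.0])
centroids, labels = lloyds_kmeans(X, k=2)
```

Each iteration touches every point and every centroid, which is exactly why classical k-means gets expensive for big, high-dimensional data.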
He comments that you don't get everything for free; it is taking serious work for them to map useful linear algebra into the crazy world of quantum mechanics. The explanations he offers appear quite intuitive (he's a good communicator), although I expect they are deceptively complex once you step into the detail. It's not really my area.
Patrick Rebentrost and his collaborators started from the observation that machine learning and quantum mechanics are both fundamentally about manipulating large numbers of vectors in high-dimensional spaces, and set about bringing the two fields together. The key papers on these ideas are Quantum algorithms for supervised and unsupervised machine learning and Quantum support vector machine for big feature and big data classification, both from 2013.