Original author: 0xTodd, Ebunker co-founder
Editor's note: In May this year, Vitalik Buterin republished his 2020 article on fully homomorphic encryption (FHE), which walks through the mathematics behind how FHE works. Yesterday, Ebunker co-founder 0xTodd explained on X, in plain language, what FHE is and where it can be applied. BlockBeats reprints the full text below:
The market has been quiet recently, so I finally have more time to keep sharing some newer technical directions. Although the crypto market of 2024 is not as turbulent as in past cycles, a number of new technologies are still trying to mature, including today's topic: FHE, fully homomorphic encryption.
Vitalik also published an article about FHE in May this year, which is recommended reading for anyone interested.
So what kind of technology is FHE?
To understand the intimidating term "fully homomorphic encryption", you first need to understand what "encryption" is, what "homomorphic" means, and why "fully" is needed.
Ordinary encryption is the part everyone already knows. Say Alice wants to send Bob a message, for example "1314 520".
If a third party, C, has to deliver the message and it must stay confidential, the fix is simple: encrypt each number by multiplying it by 2, producing "2628 1040".
When Bob receives it, he divides each number by 2 and recovers the message: Alice is saying "1314 520".
See? The two of them have transmitted the message using symmetric encryption, with C hired to carry it but never learning its content. In most spy movies, the communication between two contacts never goes beyond this.
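As a playful illustration of that ×2 "cipher", here is a minimal Python sketch. The doubling key, the message, and the whole scheme are toy assumptions taken from the story above, not real cryptography:

```python
# Toy illustration of the Alice -> C -> Bob exchange above.
# Multiplying by 2 is NOT real encryption; it only mirrors the story.

def encrypt(numbers, key=2):
    """Alice "encrypts" each number by multiplying it by the shared key."""
    return [n * key for n in numbers]

def decrypt(numbers, key=2):
    """Bob "decrypts" by dividing each number by the same key."""
    return [n // key for n in numbers]

plaintext = [1314, 520]            # Alice's message
ciphertext = encrypt(plaintext)    # what courier C sees: [2628, 1040]
print(ciphertext)                  # [2628, 1040]
print(decrypt(ciphertext))         # Bob recovers [1314, 520]
```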
Now Alice's requirements get harder:
- Suppose Alice is only 7 years old;
- Alice can only do the simplest arithmetic, ×2 and ÷2, and understands no other operations.
Now suppose Alice has to pay the electricity bill. Her bill is 400 yuan a month, and she is 12 months in arrears.
But 400 × 12 = ? is beyond what a 7-year-old can compute; the multiplication is too complicated for her.
At the same time, she does not want anyone else to know how large her bill is or how many months she owes, because that is sensitive information.
So Alice asks C to do the calculation for her, without trusting him.
Since she only knows ×2 and ÷2, she encrypts her numbers by simply multiplying them by 2, then asks C to compute 800 × 24, that is, (400×2) × (12×2).
C is an adult with strong mental arithmetic; he quickly works out 800 × 24 = 19200 and tells Alice the result. Alice then computes 19200 ÷ 2 ÷ 2 and learns that she owes 4800 yuan for electricity.
Did you catch that? This is the simplest form of multiplicatively homomorphic encryption: 800 × 24 is just a mapping of 400 × 12. The shape of the computation is the same before and after the transformation, hence "homomorphic".
This kind of encryption achieves the following: someone can delegate a computation to an untrusted party while ensuring their sensitive numbers are never revealed.
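Sticking with the same toy scheme, the electricity-bill delegation can be sketched like this. Again, doubling is not a real homomorphic scheme (real ones rely on lattice-based math); this only mirrors the worked example:

```python
# Toy "multiplicatively homomorphic" sketch of the electricity-bill story.
KEY = 2

def encrypt(n):
    return n * KEY                     # 400 -> 800, 12 -> 24

def evaluate_product(ciphertexts):
    """Untrusted C multiplies the ciphertexts without seeing the plaintexts."""
    result = 1
    for c in ciphertexts:
        result *= c
    return result

def decrypt_product(ciphertext, num_factors):
    """Alice divides once per encrypted factor: 19200 / 2 / 2 = 4800."""
    return ciphertext // (KEY ** num_factors)

encrypted = [encrypt(400), encrypt(12)]          # C sees only 800 and 24
encrypted_result = evaluate_product(encrypted)   # C computes 19200
print(decrypt_product(encrypted_result, 2))      # Alice recovers 4800
```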
However, that was an idealized problem. Real-world problems are not so simple: not everyone is 7 years old, nor as honest as C.
Consider a nastier situation: C might try to work backwards, and through exhaustive search he could recover that Alice actually wanted to compute 400 × 12.
At this time, "fully homomorphic encryption" is needed to solve the problem.
When Alice multiplies each number by 2, that 2 can be regarded as noise. If there is too little noise, C can crack it easily.
So Alice can add an addition step on top of the multiplication.
Ideally, the noise should look like a major intersection at 9 a.m. rush hour; then cracking it becomes all but impossible for C.
So Alice might multiply 4 times and add 8 times, which greatly reduces C's chance of cracking the scheme.
Even so, what Alice has is still only "partially" homomorphic encryption, meaning:
(1) She can only encrypt a specific class of problems;
(2) She can only use a specific set of operations, because the number of additions and multiplications cannot be too large (generally no more than 15).
"Fully" means that Alice should be able to apply additive and multiplicative encryption to a polynomial any number of times, so that a third party can be entrusted with the computation and the correct result can still be recovered after decryption.
A sufficiently long polynomial can express most of the mathematical problems in the world, not just a 7-year-old's electricity bill.
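To make "fully" a bit more concrete, here is a deliberately insecure toy in which "ciphertexts" support both addition and multiplication by tracking how many multiplications they have absorbed. The level counter is only a loose stand-in for the noise bookkeeping that real FHE schemes perform, and everything in the snippet is an illustrative assumption:

```python
# Toy sketch: "ciphertexts" that survive both addition and multiplication.
# The (value, level) pair loosely mimics the level/noise bookkeeping of
# real leveled FHE schemes; it is NOT secure and NOT a real scheme.
KEY = 2

def encrypt(n):
    return (n * KEY, 1)                    # fresh ciphertext at level 1

def add(a, b):
    (va, la), (vb, lb) = a, b
    assert la == lb, "toy rule: only add ciphertexts at the same level"
    return (va + vb, la)

def multiply(a, b):
    (va, la), (vb, lb) = a, b
    return (va * vb, la + lb)              # levels accumulate, like noise

def decrypt(c):
    value, level = c
    return value // (KEY ** level)

bill, months = encrypt(400), encrypt(12)
total = multiply(bill, months)                 # C computes blindly
late_fee = multiply(encrypt(50), encrypt(1))   # a made-up 50-yuan fee
print(decrypt(total))                          # 4800
print(decrypt(add(total, late_fee)))           # 4850
```

In a genuine FHE scheme every operation also grows a noise term, and Gentry's 2009 idea of bootstrapping is what resets that noise so the number of operations is no longer capped; that is what puts the "fully" in FHE.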
With an arbitrary number of encrypted operations layered on, it becomes practically impossible for C to spy on the private data, and you truly get to have it both ways.
That is why fully homomorphic encryption has long been regarded as the holy grail of cryptography.
In fact, homomorphic encryption technology only supported partial homomorphism until 2009.
That year, new ideas proposed by Gentry and other scholars opened the door to fully homomorphic encryption; interested readers can refer to his paper.
Many people still wonder about the application scenarios for this technology: in what situations would fully homomorphic encryption (FHE) actually be needed?
For example - AI.
As we all know, a powerful AI needs enough data to feed it, yet much of that data is far too privacy-sensitive to hand over. Can FHE let us have it both ways here?
The answer is yes.
You can:
(1) Encrypt your sensitive data with FHE;
(2) Feed the encrypted data to the AI for its computation;
(3) The AI then spits out a result that looks like gibberish nobody can read.
The AI can manage this blindly because, in its eyes, the data is essentially just vectors. AI, and especially generative AI like GPT, never truly understands the words we feed it; it "predicts" the most fitting answer from those vectors.
However, since that gibberish still follows certain mathematical rules, and you are the one who encrypted it:
(4) You can go offline and decrypt the gibberish locally, just like Alice did;
(5) You have thereby achieved this: the AI used its enormous computing power to complete the calculation for you without ever touching your sensitive data. A rough sketch of this flow is shown after the list.
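Here is a rough, runnable sketch of that five-step flow. The "FHE" below is still the insecure doubling toy from earlier, the "remote model" is an invented linear scorer with made-up weights, and the health readings are fabricated placeholders; only the shape of the data flow is the point, since a real GPT-scale model would need a genuine FHE scheme:

```python
# Hypothetical end-to-end flow for private AI inference.
# The "FHE" is the same insecure doubling toy as above, the "model" is an
# invented linear scorer, and the readings are made up; only the flow matters.
KEY = 2

def fhe_encrypt(values):
    return [v * KEY for v in values]           # step 1: encrypt locally

def remote_model(encrypted_features):
    # Steps 2-3: the server only ever sees doubled numbers it cannot
    # interpret, and returns a result that is still "encrypted".
    weights = [0.2, 0.5, 0.3]                  # invented model weights
    return sum(w * c for w, c in zip(weights, encrypted_features))

def fhe_decrypt(value):
    return value / KEY                         # steps 4-5: decrypt locally

sensitive = [36.6, 120, 80]                    # fabricated health readings
encrypted = fhe_encrypt(sensitive)
encrypted_score = remote_model(encrypted)      # server works blind
print(fhe_decrypt(encrypted_score))            # ~91.32, computed without exposure
```

The key property is that remote_model only ever handles the encrypted numbers, while decryption happens entirely on the client's side.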
Today's AI cannot do this; it forces you to give up privacy. Think about it: everything you type into GPT goes in as plain text! None of the above is achievable without FHE.
This is the root of the natural fit between AI and FHE. A thousand words boil down to one phrase: having it both ways.
Since FHE links up with AI and straddles the two big fields of crypto and AI, it naturally enjoys extra favor. There are many FHE-related projects, such as Zama, Privasea, Mind Network, Fhenix, and Sunscreen, and their application directions are quite creative.
Today, let's look at one of them, @Privasea_ai, an FHE project whose funding round was led by Binance. Its white paper describes a very fitting scenario: face recognition.
Have it both ways: the machine's computing power can determine whether the person is a real, live person;
while the machine never handles any of the sensitive facial data.
Introducing FHE effectively solves this problem.
However, doing FHE computation in the real world demands enormous computing power. After all, Alice needs to perform an "arbitrary" number of encrypted additions and multiplications, and the computation, encryption, and decryption are all compute-heavy processes.
Privasea therefore needs to build a powerful computing network with supporting facilities, and it has proposed a PoW + PoS architecture for that network.
Privasea recently announced its own PoW hardware, the WorkHeart USB, which can be understood as one of the supporting facilities of its computing network; put simply, a mining device.
Its initial price is 0.2 ETH, and it mines from a pool of 6.66% of the network's total token supply.
There is also a PoS-like asset called the StarFuel NFT, which can be understood as a "work permit", capped at 5,000 units.
Its initial price is also 0.2 ETH, and it grants 0.75% of the network's total token supply (via airdrop).
This NFT is interesting: it is PoS-like but not real PoS, apparently to sidestep the question of whether PoS counts as a security in the United States.
The NFT lets users stake Privasea tokens, but instead of directly paying out PoS income, it doubles the mining efficiency of the USB device bound to it, making it PoS in disguise.
Back to the main topic: if AI can truly bring FHE into large-scale use, that would be a boon for AI itself. Many countries now center their AI regulation on data security and data privacy.
To give a crude example: in the Russia-Ukraine war, parts of the Russian military reportedly tried to use AI, but given the American background of so many AI companies, their intelligence would likely leak like a sieve.
Yet refusing to use AI means falling behind. Even if the gap does not look large today, in another 10 years we may not be able to imagine a world without AI.
So data privacy is everywhere in our lives, from wars between nations to unlocking a phone with your face.
In the AI era, if FHE technology can truly mature, it will undoubtedly be humanity's last line of defense.