Keyless security system
KEYLESS ENCRYPTION WITH EQUAL-PROBABILITY ALGORITHMS
ABSTRACT
Today much of our life is lived in the information world: our identity has become digital, an array of data that describes and identifies us. Our information security has become identical to our personal security and heavily dependent on security systems. Keys and passwords remain the weakest link in data protection systems, and it is this vulnerability that is exploited most frequently. Is it possible, even theoretically, to have cryptography without keys, authentication without passwords, a world without phishing, a security system in which the "human factor" is finally neutralized?
KEYWORDS
Equal-probability (keyless) encryption algorithms, passwordless authentication, multifactor alternative digital signature, neutralizing the "human factor", protection against phishing attacks, completely closed communication channels
1. INTRODUCTION
This is the idea of symmetric cryptography turned inside out. Is it possible to encrypt without agreeing on the rules in advance? Is there room for uncertainty where precision is needed? Never has a consequence come before its cause; tomorrow has never arrived before today.
Our world today is communication without borders or limitations. Our future is built on new communications: exchanging data with robotic systems, smart homes, smart assistants, and cars. The digital world is evolving faster than we can get used to it, and new communication systems need new security technologies. Attacks that weaken information security by compromising the "human factor" are steadily growing, and the sophistication and danger of attacks on key and password infrastructure, whose compromise is unacceptable in the digital ecosystem, keep increasing. We are used to keys and passwords being mandatory attributes of any security system. Replacing the key-lock pair seems pointless as long as everything works fine, yet over time the likelihood of the key being compromised steadily grows. If the key is changed at least occasionally, the probability of its compromise drops; the more frequent the change, the lower that probability. If the key is changed very often, the chance of compromise approaches zero, the key-complexity factor stops dominating, and the need to guard the key shrinks along with the key's lifespan. And if the key is changed more often still, as often as possible, the key-complexity factor can be neglected, and both the danger and the value of its compromise tend to zero.
In such circumstances the value of the key itself also tends to zero. Then it is no longer clear what will
serve as the criterion of the "owner", the protection factor? For example, suppose that instead of a key-lock pair the only secret is the direction in which the "door" opens, with the added condition that only one attempt is allowed; the probability of breaking such a door is then 50%. If we install 256 such doors, each with its own opening rule and each allowing only one attempt, such a security system is equal in strength to a 256-bit key, but without the possibility of brute-forcing it. But is it practical to put up 256 doors instead of one key and one door? And what is the point of such a logical tunnel of rules if knowing all 256 opening rules amounts to the same key? Let's not jump to conclusions: if only one opening attempt is allowed, the number of doors can be considerably reduced without loss of reliability, say, to 16. And if a time limit is added to each rule, the number of "doors" can safely be reduced to 8 binary rules, which still admit 2^8 = 256 different combinations. Note that if such a keyless system uses a large group of rules (for example, 1000) to open the 8 doors (with 256 possible combinations), if all 1000 rules replace one another quickly and in strict order, and if any single rule is relevant only for one very short period of time while all 1000 rules together guard one secret (one open message, Om), then compromising any one rule out of 1000 does not compromise the whole secret (Om). Nothing special, except that in this concept the number of doors can be reduced further, to an "indecent" number such as 4 (that is, 2^4 = 16 combinations), provided the total number of rules protecting one secret (Om) remains large, for example 1000. It was 256, and now it is 4: progress. But there are many additional conditions; is it realistic to meet them?
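A rough sketch of this arithmetic (a worked illustration only, not part of the proposed system): one attempt against N independent binary rules succeeds with probability (1/2)^N, so 256 binary doors match the guessing resistance of a 256-bit key, while 8 doors still admit 2^8 = 256 rule combinations per time window.

```python
# Toy arithmetic for the "doors" thought experiment: a single attempt against
# N independent binary rules (e.g., which way each door opens) succeeds with
# probability (1/2)**N, and N binary doors admit 2**N rule combinations.
def break_in_probability(doors: int) -> float:
    return 0.5 ** doors

for doors in (4, 8, 16, 256):
    print(f"{doors:>3} doors: 2**{doors} = {2**doors} rule combinations, "
          f"one-shot break-in probability ~ {break_in_probability(doors):.3g}")
```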
These observations lead to a conclusion: no matter how low the difficulty of a problem falls, if the time and the number of attempts allotted for solving it tend to zero, the effective difficulty of solving it increases greatly! This philosophical essay formed the basis of a conjecture, a timid intellectual fantasy on the subject: "Is another cryptography possible, at least theoretically, without a key, or at least without a guarded secret that must be agreed upon as a common encryption scheme before encryption of Om begins?"
This is "cryptography in reverse": many different encryption schemes are created in the
beginning, a unique chain of these schemes following one another is created, this group of
schemes will be relevant only once, only for one sequence of events. And this technique is used:
this chain of encryption schemes is not agreed before encryption, is probabilistic, but it will be
possible to verify it by analyzing the result - future Om. Then it turns out that the decrypted Om
we have in the present is an undefined version of the original Om. And we don't know or read
the original Om until the next Om is obtained and deciphered! This is cryptography in reverse, I
have nothing against it if such a concept can work reliably...
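To make the deferred-verification idea concrete, here is a minimal, purely hypothetical sketch: a small public pool of XOR-mask "schemes", a sender who picks one equiprobably per packet, and a receiver who keeps every candidate decryption of packet i until a check value carried by packet i+1 singles out the right one. All names, the pool construction, and the cleartext check value are my illustrative assumptions, not the article's actual mechanism.

```python
import hashlib
import secrets

POOL_SIZE = 16  # toy-sized public pool of candidate "schemes"

def mask(index, length):
    """Derive scheme `index` as an XOR keystream (a stand-in for a real scheme)."""
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(f"scheme-{index}-{counter}".encode()).digest()
        counter += 1
    return out[:length]

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def encrypt(msg, prev_scheme=None, prev_msg=None):
    """Pick a scheme equiprobably; the packet also carries a check value that
    retroactively confirms the PREVIOUS message's scheme choice."""
    s = secrets.randbelow(POOL_SIZE)
    ct = xor(msg, mask(s, len(msg)))
    check = (hashlib.sha256(bytes([prev_scheme]) + prev_msg).digest()
             if prev_msg is not None else b"")
    return s, ct, check

def candidates(ct):
    """Receiver: until the next packet arrives, every scheme is equally
    plausible, so every candidate plaintext must be kept."""
    return {s: xor(ct, mask(s, len(ct))) for s in range(POOL_SIZE)}

def resolve(cands, check):
    """The check value in packet i+1 singles out the true plaintext of packet i."""
    for s, pt in cands.items():
        if hashlib.sha256(bytes([s]) + pt).digest() == check:
            return s, pt
    raise ValueError("no candidate matches: packet rejected as 'foe'")

# Demo: the first Om stays ambiguous until the second packet arrives.
s1, ct1, _ = encrypt(b"first Om")
_, ct2, check1 = encrypt(b"second Om", prev_scheme=s1, prev_msg=b"first Om")
print(resolve(candidates(ct1), check1))  # -> (s1, b'first Om')
```

In this toy the check value travels in the clear; in the concept described above, verification would be embedded in the decryption of the next Om itself.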
Keyless encryption, and this concept as a whole, do not look reliable at first glance. Is it possible to create even a single secret encryption scheme, let alone many symmetric schemes, without a matching factor? Who knows... Without a key, reliable encryption in a variable environment can be implemented only if all encryption schemes have an equal probability of being selected, from a very large set and without a fixed order of selection. Only in this case will the ciphercode contain no information that could be used to calculate correlations (keys), which a priori do not exist anywhere.
If the ciphercode is formed in a variable environment consisting of a set of equally probable encryption schemes, the factor certifying the author of Om will be
the encryption scheme itself, relevant and expected by the recipient at this very moment in time, for this very Om. Such a factor, if verifiable, is a logical analog of the trivial permanent authentication factor. But there is an important difference: our factor is always new, variable, unpredictable, relevant only for one generated data packet**; it is a phantom. Verifying the client through a mechanism that checks its current encryption scheme is an alternative way of passwordless authentication, performed continuously, each time with a different digital factor that is deterministic only for this Om and is used once. The reliability of such variable-factor authentication is much higher than that of a permanent or temporarily assigned factor, because it takes place continuously, with new factors in a unique sequence. Looked at more deeply, it completely neutralizes the danger of phishing attacks through the vector of compromising the human factor and key or password information.
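For comparison, the closest conventional analogue of such a continuously changing factor is a one-time hash chain in the style of Lamport/S-KEY. The sketch below assumes a stored secret seed, which the keyless concept explicitly avoids, so it illustrates only the "always new, used exactly once" property of the factor.

```python
import hashlib

def H(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def make_chain(seed: bytes, n: int) -> list:
    """Chain of n+1 links: seed, H(seed), H(H(seed)), ..."""
    chain = [seed]
    for _ in range(n):
        chain.append(H(chain[-1]))
    return chain

# The client keeps the chain private and reveals it BACKWARDS, one link per
# packet; the server stores only the last verified link, so every factor is
# new, used exactly once, and worthless to a phisher after use.
chain = make_chain(b"client-secret-seed", 1000)
server_anchor = chain[-1]  # published once at enrollment

for i in range(len(chain) - 2, len(chain) - 5, -1):
    factor = chain[i]                   # this packet's one-time factor
    assert H(factor) == server_anchor   # server-side verification
    server_anchor = factor              # roll forward; old factor expires
print("three packets authenticated, each with a fresh one-time factor")
```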
And that is not all. There is nothing more unpleasant for an attacker than not knowing the client ID, especially if it is variable. The ciphercode itself acts as a logical analogue of the client identifier (and of the server's: the process is bidirectional), which upon receipt undergoes an identification** procedure on the "friend or foe" principle. This verification procedure is an integral part of the decryption process, carried out by reconciling small linear blocks in certain rounds. Identification of the author takes place before the decrypted Om is read. Note that the equal-probability concept of ciphercode formation is in fact observed in the Vernam cipher system: for any bit of the ciphertext, either value of the corresponding bit of Om, "0" or "1", is always equally probable. The same holds for blocks of any length: all possible variants are equally probable. K. Shannon proved that encryption in the "equal probability (EP)" concept is absolutely reliable, unpredictable, and not amenable to cryptanalysis. This is probably the only absolutely reliable principle of encryption. Note that in EP there is no analytical possibility of forcing any cipher to repeat, regardless of the content of Om itself and regardless of how many times the information is repeated. That is why only the EP encryption concept can provide absolute cryptographic strength: it does not allow a deliberate repetition of a cipher, yet leaves room for a chance repetition, thereby expanding (rather than narrowing) the number of possible cipher variants! Let's try to stick to this principle.
Cryptanalysis of an equal-probability ciphercode is impossible for lack of a basis for analysis. In Vernam ciphers this is explained simply: the one-time binary tape is never repeated; it is the key, and it is always both variable and random. Therefore, in a Vernam cipher one can never predict the correlations between a future cipher and any Om; they are always different, always formed by a new random function. This is what "no basis for cryptanalysis" means: no correlations, as if there were no key at all... The variable nature of the key gives the illusion of its absence; there are no constant rules in the encryption system; it is deterministic chaos...
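The equal-probability property of the Vernam cipher is easy to demonstrate: encrypting the same plaintext under fresh, uniformly random one-time keys makes every ciphertext value equally likely, so the ciphertext carries no correlation with Om. A minimal demonstration:

```python
import secrets
from collections import Counter

def vernam(plaintext: bytes, key: bytes) -> bytes:
    """One-time pad: XOR with a never-reused, uniformly random key."""
    assert len(key) == len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, key))

# Encrypt the SAME one-byte plaintext 100,000 times with fresh one-time keys:
# every ciphertext value 0..255 occurs with (statistically) equal frequency,
# i.e., the ciphertext is uncorrelated with the plaintext.
counts = Counter()
for _ in range(100_000):
    counts[vernam(b"A", secrets.token_bytes(1))[0]] += 1
print(len(counts), min(counts.values()), max(counts.values()))  # 256, ~391 each
```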
Suppose, in a general sense, that a cipher system in which keys are not used and cipher schemes can be chosen on the equal-probability principle is a cryptosystem similar to the Vernam cipher system, in which keys are always used but have the highest level of entropy and are never repeated. The similarity arises because the value of any byte of the ciphercode formed by either of these two systems always has an equal probability of appearing in the ciphercode, regardless of the content of Om. The ciphercode and Om have a one-time, uniquely defined rule of interaction, which will never occur again if the ciphercode is at least, say, 256 bits long. Otherwise, either the Vernam system is using keys of unacceptable quality, or, in the keyless system, the choice of encryption schemes is not equally probable. Moreover, if we consider theoretically the probability of a ciphercode repeating as a whole, such a probability exists only for different contents of open messages and, conceptually, should equal zero for the same Om.
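A back-of-the-envelope check of that claim (my arithmetic, assuming uniformly distributed 256-bit ciphercodes): by the birthday bound, even an astronomically long observation window leaves a vanishing chance of an accidental repeat.

```python
# Birthday-bound estimate: the probability that ANY two of n uniformly random
# 256-bit ciphercodes coincide is at most n*(n-1)/2 / 2**256.
n = 10**12                       # a trillion observed ciphercodes
p = n * (n - 1) // 2 / 2**256    # upper bound on a single collision
print(f"p <= {p:.3e}")           # ~4.3e-54: repetition is practically impossible
```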
Suppose further that a system in which the ciphercodes for two Om are always created under two different encryption schemes, the difference between which is established without using a key, is similar to a system in which a new key is always used. It is exactly from this point of view that the ciphercode formed under the keyless concept is completely analogous to a Vernam-system ciphercode. The difference lies in the presence (Vernam's system) or absence (this concept) of a random (or good pseudorandom) function that is exchanged (keys) or not exchanged (here) over separate communication channels to agree on new cipher schemes. The limited space of this article does not allow a complete description of the keyless technology, but it is possible to describe briefly the "key" (main) points, which will let the attentive reader draw his own conclusions about the prospects of this idea.
To create a theoretical model of such a cryptographic system, a different, special encryption method was needed, along with a number of new functions uncharacteristic of trivial cryptography, successfully resolving the contradictions between equal-probability random choice and the continuous pairwise symmetry of different encryption schemes. As a consequence, keyless encryption technology forms a communication channel in which not only the volume of transmitted data (Om) and the direction of transfer are hidden, but even the fact of an information exchange is unobservable**(!): an effectively completely closed communication channel (WCC) is created. The only parameter accessible to an external observer is an estimate of the maximum possible (as a probabilistic parameter) volume of Om the users could have exchanged over the observed period. Such a channel is ideal for hiding the fact that commands are being sent to control robotics, for masking the level and the fact of communication in security systems, and for closed data exchange over open communication channels in a point-to-point configuration, without the danger of the WCC being hacked through an attack on the "human factor", without phishing, and without compromise of keys and passwords, owing to their physical absence at the fundamental level of the technology's functioning.
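A generic way to approximate such a closed channel (a standard traffic-padding sketch, not the article's WCC mechanism): emit one fixed-size packet per time slot, substituting random cover traffic whenever there is no real ciphercode to send, so an observer learns only the channel's maximum possible capacity.

```python
import secrets

SLOT_BYTES = 256  # every time slot carries exactly this many bytes

def emit(queue):
    """One packet per slot, always SLOT_BYTES long. Real payloads are assumed
    to already be equiprobable ciphercode, so an observer cannot distinguish
    them from the random cover traffic sent when the queue is empty. The only
    observable parameter is capacity: slots * SLOT_BYTES."""
    if queue:
        payload = queue.pop(0)
        assert len(payload) == SLOT_BYTES, "payloads must fill the slot exactly"
        return payload
    return secrets.token_bytes(SLOT_BYTES)

# Demo: two real packets hidden among ten slots; all ten look alike on the wire.
queue = [secrets.token_bytes(SLOT_BYTES) for _ in range(2)]
packets = [emit(queue) for _ in range(10)]
print(all(len(p) == SLOT_BYTES for p in packets))  # True: uniform traffic
```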
This article presents some of the new encryption concepts and methods that form the core of a functionally complete variant of the keyless system: the Keyless Code Generator (KCG).