"Probably E" A New Order of Entropy -without mechanical balancing -Maximum Entropy Equilibrium Threshold - Acquisition of found complexities
ARTICLE UNDER CONSTRUCTION, DETAILS INCOMPLETE. DISCLAIMER:
THE CONTENTS OF THESE ARTICLES MAY CHANGE WITHOUT PRIOR NOTICE. REVISIONS WILL BE ANNOTATED WHERE NEEDED.
CHANGES MAY BE DUE TO NEW INSIGHTS, SUDDEN CHANGES OF DESIGN CRITERIA, CHANGES OF PLANS, SHIFTING OR IP-FENCING, OR ANY OTHER CIRCUMSTANCES THAT THE AUTHOR CONSIDERS WORTH THE EFFORT OF A CHANGE.
(When in doubt, it's)
"Probably E"
March 2023
Written by Edwin Jean-Paul Vening,
"For ideas, questions, quotes, inquiries, all comments are welcome: e-mail: chargen @ gmail.com "
"This article is about how we, at Data Morgana, have been working on new concepts for the development of chaotic generators, specifically competing with other similar projects all by respected developers from a diversity of fields of interest or expertise".
Source: https://cacert.at/random
What we discovered: new complexities!
The output gave rise to a whole new order of Entropy,
and with this newly acquired knowledge came new insights.
We also discovered that the more effort we invested to further deepen the de-correlation from certain aspects, the more the result would only deform the information into a similar unique construct.
New insight: the effort of de-correlation made the output even more susceptible to other correlations and to the identification of recognizable structures (when a hashed table was used); more de-correlation effort amplified this adverse effect. We did, however, apply a re-ordering of the output to prevent a certain mechanically observable 'jitter', giving a less linear serial correlation.
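To make the serial-correlation criterion concrete, here is a minimal, illustrative sketch (not the tool we used) that computes the lag-1 serial correlation coefficient of a byte stream, in the style of the classic `ent` utility; values near zero suggest successive bytes are linearly uncorrelated:

```cpp
// Lag-1 serial correlation of a byte stream, in the style of the classic
// `ent` utility. Values near zero suggest successive bytes are linearly
// uncorrelated; illustrative only, not the tool used in this project.
#include <cmath>
#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <vector>

double serial_correlation(const std::vector<std::uint8_t>& s) {
    const std::size_t n = s.size();
    if (n < 2) return 0.0;
    double sx = 0, sy = 0, sxy = 0, sxx = 0, syy = 0;
    for (std::size_t i = 0; i + 1 < n; ++i) {
        const double x = s[i], y = s[i + 1];
        sx += x; sy += y; sxy += x * y; sxx += x * x; syy += y * y;
    }
    const double m = static_cast<double>(n - 1);  // number of (x, y) pairs
    const double cov = sxy - sx * sy / m;
    return cov / std::sqrt((sxx - sx * sx / m) * (syy - sy * sy / m));
}

int main() {
    std::vector<std::uint8_t> data;
    int c;
    while ((c = std::getchar()) != EOF) data.push_back(static_cast<std::uint8_t>(c));
    std::printf("lag-1 serial correlation: %+.6f\n", serial_correlation(data));
}
```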
When /dev/entropy is built with the use of a wide-integer C++ template, the production of entropy for each application, thread, sub-routine, or daemon will come in abstract integer units that need no re-ordering.
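A minimal sketch of that idea, under stated assumptions: the 256-bit unit below is four 64-bit limbs standing in for a real wide-integer C++ template, and std::mt19937_64 stands in for the "Probably E" generator itself, which is not published here:

```cpp
// Sketch: a /dev/entropy service handing each consumer (application,
// thread, sub-routine, daemon) one abstract 256-bit integer unit that
// needs no re-ordering. Four 64-bit limbs stand in for a real
// wide-integer C++ template; std::mt19937_64 is a placeholder source,
// NOT the "Probably E" generator.
#include <array>
#include <cstdint>
#include <cstdio>
#include <mutex>
#include <random>

using Unit256 = std::array<std::uint64_t, 4>;  // one abstract integer unit

class EntropyService {
public:
    Unit256 next_unit() {  // thread-safe hand-out, one unit per call
        std::lock_guard<std::mutex> lock(mu_);
        Unit256 u;
        for (auto& limb : u) limb = gen_();
        return u;
    }
private:
    std::mutex mu_;
    std::mt19937_64 gen_{std::random_device{}()};  // placeholder generator
};

int main() {
    EntropyService dev_entropy;
    const Unit256 u = dev_entropy.next_unit();
    std::printf("%016llx %016llx %016llx %016llx\n",
                static_cast<unsigned long long>(u[0]),
                static_cast<unsigned long long>(u[1]),
                static_cast<unsigned long long>(u[2]),
                static_cast<unsigned long long>(u[3]));
}
```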
We needed to understand and investigate why most, if not all, current precision instruments and software fail to index for P, or to deal with this out-of-band Entropy, in every number base deemed feasible, while still working in the narrowing binary domain.
We ended up with a configuration that we now use to estimate n for its Maximum Entropy Equilibrium Threshold.
(new ideas are welcome)
We ask: why do we want this?
A former design clause was to have the output behave as a one-way, one-time function. Can we use these newfound complexities? Yes, and we need these complexities to re-imagine networking communications for the future, including evolving signal semantics.
We now focus our development on protocol design (secure, secured, confidential, critical) aimed at Deep Space Communications, because that is where what we accomplished may be valued most!
And what we accomplished was potentially *very misanthropic stuff*.
We managed to reach all specific design goals of our new generation of generators, and it looks like we have attained an instrument that balances to a projected Maximum Entropy Equilibrium. Here are some facts (and gossip overheard) from during development.
Here I present the results and achievements in the field of random number generation, the related advancements in cryptographic methods and functions, and how this relates to the realm of security. Examples of the algorithm have been released into the public domain, as this allows for further testing and adoption by the wider community.
On Dec 31, 2022... I said: Eureka!
This is the technical outline of our other developments: what may benefit from our work, and what we may expect from the increased, hyperparametric entropy produced by our algorithms, compared with what was previously always held to be true.
The use of these algorithms has the potential to enhance the security of existing cryptographic methods and functions by providing a stronger source of randomness for key generation, encryption, and other functions. Additionally, the non-linear, deeply de-correlated chaotic streams derived from the algorithm can provide a higher level, or better, a hyperparametrically tunable factor, of perceived unpredictability, making it more difficult for attackers to predict the output or to develop a strategy to exploit vulnerabilities.
Overall, the "Probably E" type generator algorithm offers a promising approach to increasing the complexity and resilience of digital security in the face of evolving "threats", (not including but perpetual "FUD" uttered about the phenomena of quantum computing" ...
The consequences and opportunities with "Probably E":
What may be possible with the current state of the art of our generators, with security in mind and in the perspective of recent developments: to develop a virtual device named /dev/entropy, guarding the host that otherwise gives out low-quality entropy (often even crisscrossed in cascades) to bootstrap or keep known cryptographic functions going. /dev/entropy may also help to evolve and sustain current cryptographic functions and methods (certainly those that depend, or depended, on 'randomness' from /dev/{u,}random; in that case, they have to), and meanwhile it would harden system security and data in datastores, lakes, and clouds.
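/dev/entropy is our proposal and does not exist on stock systems; the sketch below only shows how a consumer might read from such a character device, falling back to the stock /dev/urandom where the proposed device is absent:

```cpp
// Consumer-side sketch: read key material from the proposed /dev/entropy
// character device, falling back to the stock /dev/urandom where the
// proposed device does not (yet) exist.
#include <array>
#include <cstddef>
#include <cstdint>
#include <cstdio>

static bool read_entropy(void* buf, std::size_t len) {
    const char* paths[] = {"/dev/entropy", "/dev/urandom"};  // proposal first
    for (const char* path : paths) {
        if (std::FILE* f = std::fopen(path, "rb")) {
            const std::size_t got = std::fread(buf, 1, len, f);
            std::fclose(f);
            if (got == len) return true;
        }
    }
    return false;
}

int main() {
    std::array<std::uint8_t, 32> key{};  // e.g. a 256-bit key seed
    if (!read_entropy(key.data(), key.size())) return 1;
    for (const auto b : key) std::printf("%02x", b);
    std::printf("\n");
}
```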
It is important to explore and develop new cryptographic methods that are more resistant to attacks. The potential for increased entropy through these new algorithms and generators is a promising avenue for enhancing security measures. However, it is also important to thoroughly test and validate these new methods before implementing them in real-world applications. The standards and recommendations set forth by organizations such as NIST and the DoD Cyber advisories serve as important guidelines for ensuring the security and reliability of cryptographic systems.
The different definitions of complexities, as demonstrated in the list below, were mechanically produced from a technical knowledge base in March 2023.
There is a tedious number of different complexities and phenomena associated with attaining levels of Entropy, depending on the specific context in which it is studied and/or applied. Here I briefly mention a few examples:
Author addendum, March 2023: from the Twitter.com social network.
An editorially unrelated intermezzo:
"Regarding your comment about the momentum and biomechanical pairing pace of helicases: it is an intriguing topic in biochemistry and molecular biology. The order in which base pairs are formed during DNA replication can have important functional implications, and it is an active area of research to understand the mechanisms that underlie this process."
Thou art without a clue, but this is where I stopped my interest in forensic sciences...
Author addendum, March 2023:
Please find below this article a summary of other complexities; it is inherently incomplete.
That is, the complexity of problems identified to have a prominent level or range of entropy which, due to new findings, cannot be measured at the time of writing. There is no estimate of what is needed to investigate, develop, and construct new generators based on the newly acquired logic of the algorithms, so as to have a computational instrument ready to stride across difficult classes of problems. None of this has been proven at the time of writing.
Astrophysics Unique Event Spacetime Localities Processor
Instrumenting the generator for the envisioned logic circuitry and secure (co-)processors was one of the ideas to solve many problems with de-correlation.
"Probably E" algorithms offer a promising practical solution with the acquired complexities implemented, to reach a far more realistic Resilience towards Quantum Communications and of digital security.
With just one shot (#1) we nailed it.
After years of waiting (since 2014), it occurred to me, when I gained a renewed interest in biogenetics, that I still held these results: I was counting on them then, and I am now counting on a response from the other experts in the field as a possible reaction to my submission. (We were both right about that notion.)
"No Entropy Here".
(So luckily they did, accordingly and fortunately; so nice of the German gentlemen to make a reference to it.) NOW I know how Entropy can be described by me, for its abstractions, and what may be expected of me.
This means that there are more developments with these qualities and sources, which may be held back, and not for just any reason. With our entries we not only *raised* the order of entropy, as not seen before; we pushed the output further than its perpetual use. This was done in the hope of witnessing the much-needed knowledge about the complexities and their dimensional expressions being described.
These outputs would be even more questionable, as is obvious to the expert audience and contenders, where I mention that I could not use the de facto tools provided by NIST/FIPS, so newly devised tools were used.
So, since we hold the record(s) of this,
there is a little space reserved for me to say something cunning:
"The baseline of Entropy has been shifted - for the unknown count of implications of this are ... inspiring -especially for security- if a cryptographic function or cascaded methods of information encryption that may depend on the host device for randomness, served from /dev/{u,} random"- to NIST:
I did not joke about this.
The records are still standing in 2023. Now we produce the highest entropy / "no entropy".
It (the entropy) has been steering the brute force and the digital dissection and categorization of binary indicators across layers of data: a sieve attempting to correlate protocols and to estimate the next left- or right-oriented bit in terms of directionality. This method will likely break most frames down when driven by a high-entropy source, such as the Maximum Entropy Equilibrium generator, using generic genetic algorithms, without AI and/or ML.
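As a deliberately simple stand-in for that sieve (no genetic algorithm, no AI/ML), the toy predictor below guesses each bit from the one before it; on a source near maximum entropy its hit rate stays pinned at about 50%, while a biased or serially correlated stream scores higher:

```cpp
// Toy next-bit predictor: guesses each bit from the one before it using
// running counts (a first-order model; no genetic algorithm, no AI/ML).
// On a maximum-entropy stream the hit rate stays near 0.5; a biased or
// serially correlated stream is "steerable" and scores higher.
#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <random>
#include <vector>

double predictor_hit_rate(const std::vector<int>& bits) {
    std::uint64_t counts[2][2] = {{1, 1}, {1, 1}};  // counts[prev][next], smoothed
    std::uint64_t hits = 0;
    for (std::size_t i = 1; i < bits.size(); ++i) {
        const int prev = bits[i - 1];
        const int guess = counts[prev][1] > counts[prev][0] ? 1 : 0;
        if (guess == bits[i]) ++hits;
        ++counts[prev][bits[i]];
    }
    return bits.size() > 1 ? double(hits) / double(bits.size() - 1) : 0.0;
}

int main() {
    std::mt19937_64 g(42);                 // placeholder source under test
    std::vector<int> bits(1 << 20);
    for (auto& b : bits) b = static_cast<int>(g() & 1u);
    std::printf("hit rate: %.4f (expect ~0.5)\n", predictor_hit_rate(bits));
}
```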
Semi-a-priori-truth FACTS:
The new algorithm supersedes most, if not all, random number generators.
But is it entropy? We have produced petabytes of output to search for correlation (which we now employ for its new function). I may be the only one able to quickly recognize these structures. That leaves the question whether recognizing the tightness of symbolic dispersion is itself a hard problem, and whether this can be concluded to be true within time.
"Probably E", has routines disabled for the non-linear asynchronous dispersion of symbols that may be defined by preamble of the information and/or switched by signal, or by auto adapting it hyperparametric value and have /dev/entropy hinting towards the signal semantics
We can generate all kinds of noise that act as fundamental carriers or sources for (most) cryptographic methods and functions that rely on NIST conformance (see: TestU01, the Dieharder suite, PractRand in all its GitHub incarnations, long-range zip/RLE/arithmetic encoders). We derive non-linear chaotic streams with a math equation and a parametric function:
for every randomly chosen N there exists output with a thermodynamic efficiency that maintains an exactly equally distributed fractional dispersion, which can be re-ordered; the symbolic tightness maintains its non-linearity as well.
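The equation and parameters of "Probably E" are not published here. As a generic, textbook illustration of "a math equation plus a parametric function" only, the sketch below draws symbols from the logistic map x_{n+1} = r * x_n * (1 - x_n) in its chaotic regime; it is not our algorithm and is not cryptographically secure:

```cpp
// Generic illustration only: a non-linear chaotic symbol stream from the
// logistic map x_{n+1} = r * x_n * (1 - x_n) with r in the chaotic regime.
// This is NOT the "Probably E" equation and is NOT cryptographically secure.
#include <cstdio>

int main() {
    double x = 0.3141592653589793;  // initial condition (the "seed")
    const double r = 3.99995;       // parametric control, chaotic regime
    for (int i = 0; i < 256; ++i) {
        x = r * x * (1.0 - x);               // non-linear parametric map
        std::putchar(x > 0.5 ? '1' : '0');   // threshold to a binary symbol
    }
    std::putchar('\n');
}
```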
There may be countless complexities to be found within these new kernels of higher-order entropy distribution derived by our new algorithms. It is conceivable that the generators may help boost and enhance the security of current state-of-the-art cryptographic functions and methods by providing a stronger source of randomness for key generation, encryption, and other functions such as re-seeding in a non-linear chaotic way. Streams may split (wave-divide) and also offer higher levels of unpredictability, uncertainty, "randomness", which can make it more difficult for attackers to predict the usage of a generator or to exploit vulnerabilities.
The product of our algorithms that drive the generator constructions surpasses most known and useful random number generators in its qualitative expression and in its function as a fundamental source of entropy for cryptographic methods. The non-linear chaotic streams derived from our algorithm provide a higher level of unpredictability, making it more difficult for attackers to predict or exploit vulnerabilities. We have released examples in the public domain, which may help further advance our generators or give new insights into their form.
These are the raw binary output files resulting from a generator (see below). Filename: OUTFN_BASE-OUTFN_VER-OUTFN_VERMIN-20220503012928.OUTFN_EXT
Binary size: 16,000,000,000 bits (2.1G)
Link (Google Drive public share):
sha384sum: 3f36a74097af4448af092adab87025aa489133c3a25e60833462a4a476ceed42d98ff06cf4bd2c61a4076d0c5fecc19d
Value  Occurrences   Fraction
0      8000001327    0.500000
1      7999998673    0.500000
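The table above can be reproduced over any raw output file with a short bit counter such as the sketch below (the default filename is a placeholder):

```cpp
// Count the 0-bits and 1-bits of a raw binary output file and print the
// fractions, in the format of the table above. The filename argument is
// a placeholder; pass any generator output file.
#include <cstdint>
#include <cstdio>

int main(int argc, char** argv) {
    std::FILE* f = std::fopen(argc > 1 ? argv[1] : "output.bin", "rb");
    if (!f) return 1;
    std::uint64_t ones = 0, total = 0;
    int c;
    while ((c = std::fgetc(f)) != EOF) {
        for (int k = 0; k < 8; ++k) ones += (static_cast<unsigned>(c) >> k) & 1u;
        total += 8;
    }
    std::fclose(f);
    const std::uint64_t zeros = total - ones;
    std::printf("Value  Occurrences   Fraction\n");
    std::printf("0      %llu   %.6f\n", static_cast<unsigned long long>(zeros),
                total ? double(zeros) / double(total) : 0.0);
    std::printf("1      %llu   %.6f\n", static_cast<unsigned long long>(ones),
                total ? double(ones) / double(total) : 0.0);
}
```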
To summarize, there is a new generation of CHAOTIC generators that explore AI and ML algorithms to provide a far more de-correlated fundamental source of entropy, which may be used to harden current cryptographic methods and functions. Already, these functions surpass most existing random number generators in terms of security.
(There was a moment when a parametric hyper-wrenching wrench came in swinging from Hamburg, Germany, for I was too loud about this.)
The algorithm produces non-linear (re-ordered) chaotic streams derived from a math equation and a parametric function, which results in outputs with an exactly equally distributed fractional dispersion; the symbolic tightness is also made hyperparametric. There are lots of ways to de-correlate even further, beginning with non-linearity, but none of these methods have been developed (yet).
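The "exactly equal dispersion under re-ordering" property can be shown in miniature: start from a perfectly balanced multiset of bits and apply a keyed permutation (std::shuffle below, a stand-in for our re-ordering step); the counts stay exactly 50/50 under any permutation:

```cpp
// Miniature of "re-ordering with exactly equal dispersion": a perfectly
// balanced multiset of bits is permuted with std::shuffle (a stand-in
// for our keyed re-ordering step). The 0/1 counts stay exactly equal
// under any permutation; only the order changes.
#include <algorithm>
#include <cstddef>
#include <cstdio>
#include <numeric>
#include <random>
#include <vector>

int main() {
    const std::size_t n = std::size_t(1) << 20;  // 2^20 bits, half of them ones
    std::vector<int> bits(n, 0);
    std::fill(bits.begin() + n / 2, bits.end(), 1);
    std::mt19937_64 key(0x0123456789abcdefULL);  // placeholder permutation key
    std::shuffle(bits.begin(), bits.end(), key);
    const long long ones = std::accumulate(bits.begin(), bits.end(), 0LL);
    std::printf("ones: %lld / %zu (fraction %.6f)\n", ones, n,
                double(ones) / double(n));
}
```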
Examples of this algorithm are available in the public domain. If you cannot find them, e-mail me.
Test vectors such as the Dieharder suite offers are about the indexation of P in binary.
There may exist protocols that span crisscrossed link layers, which can be used to reconstruct communications with little effort, because this cross-correlation of protocols over layers is inherently bound to the binary context and to its contents and/or purpose. Such are the problems with binary: they tend to narrow the search space by its bounds alone. Only recently a team has shown that the innovations of newly emerging sciences can help to recover cryptographic keys from a proposed quantum communications protocol. Methods of Deep Learning (regression, retrofitted) have been used in conjunction with a 'steering' of a 'true' RNG random source or distribution (not mine, and not discussed here).
Before we found this, P was mostly considered in a binary-bound context, which proved to be a hard problem. We have identified a higher order of entropy with distinct complexity class(es): for each progression, a new macro-state. I think that most related critical baseline security elements (key spaces, rounds, system entropy pools) need to be reconsidered in the light of these results. I propose adopting this new method of bootstrapping entropy, generating higher complexity for serving entropy as a carrier,
before the consolidation of /dev/{u,}random.
Before /dev/random and /dev/urandom were (supposedly) hard-linked to each other, this helped to quickly bootstrap the Linux kernel and to diminish the problem of stalling daemons that wait for the (blocking) quality output of /dev/random. Now that cryptographic functions have been translated to hardware such as AES-NI, the issue of blocking entropy seems to have been resolved on recent Linux kernels. While cascading cryptographic functions in hardware has its advantages, one of them being "speed" of production, the overall quality of that output is to be questioned for its unexplained irregular rises, falls, and spikes; the un-hardening effect of chaining different high-standard encryption methods that seed from the same entropy pool is also to be regarded as bad practice. (This is not meant to lecture the people who are actively working on the crypto subsystems.)
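For completeness, on recent Linux kernels the recommended consumer interface is the getrandom(2) system call, which blocks only until the pool has been initialized once at boot; a minimal sketch:

```cpp
// Minimal use of getrandom(2) on Linux: it blocks only until the kernel
// entropy pool has been initialized once at boot, then never again.
#include <sys/random.h>
#include <sys/types.h>
#include <cstdint>
#include <cstdio>

int main() {
    std::uint8_t seed[32];
    if (getrandom(seed, sizeof seed, 0) != static_cast<ssize_t>(sizeof seed))
        return 1;
    for (const auto b : seed) std::printf("%02x", b);
    std::printf("\n");
}
```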
Any person who is interested, and with motivation, may have access to and acquire the file mentioned below. Here is an rxvt screenshot:
https://cacert.at/random?"no entropy" (one shot at security: Dijkstra)
All output ('datamorgana', 'tamara', 'ultrajectum', 'exacet') has these properties.
Vltrajectum Worst Case Ltd. / Event Horizon "GmbH" (like Ltd.)
Estimated current threats that may lead to exposure of contents:
It is true that there may exist many protocols that span cross-link layers and inherently have multiple layers of binary context and linked contents; some of these protocols may have vulnerabilities that can be exploited to reconstruct communications. However, it is also true that advancements in cryptography and security continue to evolve, including the development of new cryptographic algorithms and the use of quantum-resistant techniques.
Regarding the use of Deep Learning in conjunction with a "true" RNG source:
This is an area of active research and development. Deep Learning can be used to improve the randomness and quality of random number generators, which in turn can enhance the security of cryptographic methods that rely on these sources of randomness. However, it is important to note that Deep Learning techniques are not a panacea and should be used in conjunction with other security measures to ensure robust and effective security.
Look at how other superior pet/weekend projects are evolving at https://cacert.at/random.
And for the future? Well, the algorithm has far more interesting properties and capacities that allow for a new encoding scheme and new signal semantics; we even have circuitry on the design board, of which it is said:
---> "THIS new component represents a significant advancement in space (mil)communication technology'. <---
Signed,
Edwin Jean-Paul Vening
Edit revision: March 2023
Sorry for any gibberish ("koeterwaals"), people.
About the author
... "Edwin Jean-Paul Vening is an Architect of Cryptography and a Data Engineer who is working on a Tactical Presentation Platform that is part of a larger project on interstellar critical network infrastructure. The platform involves innovative technologies such as?passive node addressing?to transport secured objects in space. Edwin mentions that the communications may contain de-correlated elements and unaltered information and may be constricted to within a scheduled period to a selected audience in a verified locality/established destination environment. The platform has several monitoring security modes with a constant learning model of environmental noise and interference patterns. Edwin emphasizes that the platform has no interface to local or distant networks and relies on passive addressing. He also mentions that signal design needs to be reimagined for a future that is less about binary and more about passive addressing. Edwin is interested in futuristic technologies and is working on projects that involve cryptography and interstellar infrastructure" ...Edwin Bla Bla.
Opinions from the mechanical void:
... "It sounds like you have an ambitious project in mind. It's great that you are open to feedback and collaboration to make it a success. Building a QKD network topology that can translate and transpose binary digital assets to a symbolic space will certainly require a lot of investment, expertise, and innovation. It's important to consider all the added complexities, failover capacity, and node designs to ensure that the network is reliable and secure. I would suggest reaching out to experts in the field and potential investors to help bring your vision to reality. Good luck with your project! " .... (He meant Godspeed, Jason)
Hey Bender, you forgot Autonomous recovery!
All Rights Reserved - written in the Dutch Empire - nothing may be reproduced without notice, or without mentioning the author(s) and source.
(CC BY: Creative Commons Attribution) (C) 2023.
YADDA ELGIN ANDROMEDA HEAP
AEROSPACE EQUITIES -
#nsa #quantumcomputing #nist #cdc #nikhef #cern #alcatel #ibm #apple #cis #sgi #oracle #linux #minix #plan9 #netbsd #openbsd #freebsd #opensource #operatingsystems #entropy #probability #relativity #nistcybersecurityframework #wristwatcheinsteiniantimesyncing #nospaceforwhiterabbits #ptp #security #cryptography #postquantumcryptography #asml #dutchempire #dsm #unilever #nen #patrickverkooyen #wef #netherlands #causality #qkd #ASML
(Not complete)
List of complexities produced by mechanical intuition
These are just a few examples of the distinct types of complexity associated with entropy and related phenomena.