You might consider some algorithms to be variations of others, which could easily change your count by a large amount. How exactly could you use a key-exchange protocol for asymmetric encryption? With RSA, everybody can use the public key to encrypt something that only the private key can decrypt. Since matrix size is the bottleneck, we could state that DH-1024 is somewhat more robust than RSA-1024, so that's one more advantage of DH: it can be argued that it gives some extra robustness over RSA keys of the same size.
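To make the RSA point concrete, here is a toy sketch in Python with tiny illustrative primes of my own choosing (no padding; real RSA needs 2048-bit-plus moduli and proper padding such as OAEP):

```python
# Toy RSA: anyone can encrypt with the public key (n, e); only the
# holder of the private exponent d can decrypt. Demo numbers only.
p, q = 61, 53
n = p * q                    # public modulus
phi = (p - 1) * (q - 1)
e = 17                       # public exponent
d = pow(e, -1, phi)          # private exponent (Python 3.8+ mod inverse)

m = 42                       # message encoded as an integer < n
c = pow(m, e, n)             # anyone can compute this from the public key
assert pow(c, d, n) == m     # only the private key recovers m
```

The `pow(base, exp, mod)` built-in does the modular exponentiation; nothing beyond the standard library is needed.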
The total cost of an ephemeral key exchange using ECDH is one scalar multiplication with a fixed base (key generation) and one scalar multiplication with a variable base (the actual key exchange), plus one RSA private-key operation with the long-term key to sign the ephemeral public key. Would commercial value exist in creating a new method? To this end, is it a problem being pursued in the world today? (Classical computing: I am aware quantum is an academic direction at this time.) I realize that the second question is maybe opinion-based, so I would just really appreciate any thoughts on the subject. These are two distinct beasts: although they share the same core mathematical operation and key format, they do different things in different ways.
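Those two scalar multiplications can be seen in a toy ECDH sketch over the small textbook curve y^2 = x^3 + 2x + 2 mod 17 with base point (5, 1) of prime order 19 (demo parameters of my own choosing; real ECDH uses curves such as P-256 or X25519, and the signature on the ephemeral key is not shown):

```python
# Toy ECDH. None represents the point at infinity.
P_MOD, A = 17, 2
G = (5, 1)

def point_add(P, Q):
    # Elliptic-curve group law for y^2 = x^3 + A*x + B mod P_MOD.
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None
    if P == Q:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (lam * lam - x1 - x2) % P_MOD
    return (x3, (lam * (x1 - x3) - y1) % P_MOD)

def scalar_mult(k, P):
    # Double-and-add; one of these per side is the whole exchange cost.
    R = None
    while k:
        if k & 1:
            R = point_add(R, P)
        P = point_add(P, P)
        k >>= 1
    return R

a, b = 7, 11                              # the two ephemeral secrets
A_pub, B_pub = scalar_mult(a, G), scalar_mult(b, G)   # key-gen step
shared_1 = scalar_mult(a, B_pub)          # the actual key exchange
shared_2 = scalar_mult(b, A_pub)
assert shared_1 == shared_2               # both sides agree
```

Each side performs exactly one fixed-base and one variable-base scalar multiplication, matching the cost breakdown above.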
In that case, to get PFS, you need to generate a transient key pair (for asymmetric encryption or key exchange) for the actual encryption; since you usually want some sort of authentication, you may need another, non-transient key pair, at least on one side. That part is about reducing a terrifyingly large matrix. This is compatible with a one-shot communication system, assuming a pre-distribution of the public key.
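A minimal sketch of the transient-key idea, using classic DH over a demo-sized prime of my own choosing (the long-term authentication key pair is only indicated in comments):

```python
import secrets, hashlib

p = 2**127 - 1   # a Mersenne prime; demo-sized, not for real use
g = 3

def session_key():
    # Each side generates a fresh (transient) key pair per session.
    a = secrets.randbelow(p - 2) + 2     # Alice's ephemeral secret
    b = secrets.randbelow(p - 2) + 2     # Bob's ephemeral secret
    A, B = pow(g, a, p), pow(g, b, p)    # public values exchanged (each
                                         # signed with a long-term key
                                         # for authentication, not shown)
    shared_a = pow(B, a, p)
    shared_b = pow(A, b, p)
    assert shared_a == shared_b
    # a and b are then discarded: a later leak of the long-term
    # (signing) key cannot recover this session's key.
    return hashlib.sha256(str(shared_a).encode()).digest()

k1, k2 = session_key(), session_key()
assert k1 != k2   # each session gets an independent key
```

The long-term key only signs; it never encrypts session material, which is why its later compromise does not expose past traffic.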
For the curious, there were 59 KEM/PKE schemes submitted to NIST (according to the previous link). PFS has a flip side: if you want to be able to eavesdrop on your own connections or the connections of your wards (in the context of a sysadmin protecting his users through some filters, or for some debug activities), you need non-ephemeral keys.
I am aware that each would have various strengths and weaknesses, etc. Discrete logarithm on elliptic curves is not the same problem as discrete logarithm modulo a big prime: GNFS does not apply. In some specialized protocols, ECDH (with elliptic curves) gets an important edge because the public elements are much smaller.
This means that even if the long-term key is leaked at a later date, the session keys for individual connections are not compromised, even if the full data stream was captured. If you look at the details, though, you may note that the last part of GNFS, the linear-algebra part, which is the bottleneck in the case of large keys, is simpler in the case of RSA. (In practice, the "ephemeral" keys from which the DH secret is computed are often reused, which does not entail extra risks, as far as we know.) Asymmetric encryption and key exchange are somewhat equivalent: with asymmetric encryption, you can do a key exchange by generating a random symmetric key (a bunch of random bytes) and encrypting it with the recipient's public key. In that case, add another advantage: in full static DH (client and server both have a certificate with a DH public key in it, and both keys use the same parameters), they can get away with poor or nonexistent random number generation, replacing it with a mere counter.
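That equivalence can be sketched as RSA key transport: pick random bytes as the symmetric key and encrypt them with the recipient's public key (toy parameters of my own choosing, no padding):

```python
import secrets

# Key exchange built from asymmetric encryption: the sender invents a
# random symmetric key and "wraps" it under the recipient's RSA public
# key. Demo-sized primes; real use needs large keys and OAEP padding.
p, q, e = 1009, 1013, 17
n = p * q
d = pow(e, -1, (p - 1) * (q - 1))        # recipient's private exponent

sym_key = secrets.randbelow(n)           # the random symmetric key
wrapped = pow(sym_key, e, n)             # sender: encrypt with (n, e)
assert pow(wrapped, d, n) == sym_key     # recipient recovers the key
```

Both parties then use `sym_key` with a symmetric cipher; the asymmetric step only transported the key.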
Performance rarely matters (at least not as much as is often assumed). So choosing DHE_RSA instead of RSA roughly doubles the CPU bill on the server for SSL (not that it matters much in practice, though). Cryptographic algorithm design is a poor business strategy, because you are competing with widely used, freely available, efficient, standardized algorithms. Since the algorithms don't do the same thing, you could prefer one over the other depending on the usage context.
It so happens that the best known algorithms for breaking either are variants of the same method, the General Number Field Sieve (GNFS). You could create specialized hardware for performing your algorithm very quickly or with very little power, and sell or license that. A DH key is bigger if it includes the DH parameters; it is smaller otherwise. So research into it is indeed big at the moment, but if you were hoping to design an algorithm and have it standardized, you probably just missed your chance.