Introduction to Cryptography
Cryptography is a means of encoding information and communications in such a way that only those authorized can decode and process them; unwanted access to the information is thus prevented.
The root “crypt” means “hidden,” while the suffix “-graphy” signifies “writing.”
The procedures used to safeguard information in cryptography are derived from mathematical principles and rule-based calculations known as algorithms, which transform messages in ways that make them difficult to decode. These algorithms are used to generate cryptographic keys, digitally sign documents, ensure data privacy, browse the internet securely, and protect sensitive transactions such as credit and debit card payments.
Before the modern era, cryptography was almost synonymous with encryption: turning readable information into unintelligible nonsense. The sender shares the decoding procedure only with the designated receivers, to prevent intruders from reading an encrypted message. In the cryptography literature, the names Alice (“A”) for the sender, Bob (“B”) for the intended recipient, and Eve (“eavesdropper”) for the adversary are frequently used. Cryptographic methods have grown increasingly complex, and their applications more varied, since the development of rotor cipher machines in World War I and the introduction of computers in World War II.
Modern cryptography is heavily based on mathematical theory and computer science practice; cryptographic methods are designed around computational hardness assumptions, making them difficult to break in practice. While it is theoretically possible to break a well-designed system, doing so is infeasible in practice. Properly designed schemes of this kind are called “computationally secure.” However, theoretical advances (e.g., improvements in integer factorization algorithms) and faster computing hardware necessitate ongoing reevaluation and, if necessary, adaptation of these designs. Information-theoretically secure schemes, such as the one-time pad, which are provably unbreakable even with unlimited computing power, are far more difficult to use in practice than the best theoretically breakable but computationally secure systems.
More broadly, cryptography is concerned with developing and analyzing methods that prevent third parties or the general public from reading private messages; modern cryptography emphasizes data confidentiality, data integrity, authentication, and non-repudiation. The fields of mathematics, computer science, electrical engineering, communication science, and physics all meet in modern cryptography. Its applications include electronic commerce, chip-based payment cards, digital currencies, computer passwords, and military communications.
The evolution of cryptographic technology in the Information Age has raised a slew of legal issues. Because of its potential use for espionage and sedition, many countries have classified cryptography as a weapon, limiting or prohibiting its use and export. In some jurisdictions where cryptography is legal, investigators may be able to compel the surrender of encryption keys for documents pertinent to an investigation. Cryptography also plays a crucial part in digital rights management and copyright infringement disputes over digital media.
Techniques Used in Cryptography
In today’s computer age, cryptography is frequently associated with transforming plaintext into ciphertext — text encoded so that only its intended receiver can decode it — a process known as encryption. The reverse process, transforming ciphertext back into plaintext, is known as decryption.
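The round trip from plaintext to ciphertext and back can be sketched with a classical shift cipher. This is a toy illustration only, not a secure scheme; the helper names `encrypt` and `decrypt` are invented for the example:

```python
# Toy illustration of encryption/decryption (NOT secure): a shift cipher
# over the lowercase alphabet. Each letter moves forward by `key` places.

def encrypt(plaintext: str, key: int) -> str:
    """Shift each lowercase letter forward by `key` positions (mod 26)."""
    return "".join(
        chr((ord(c) - ord("a") + key) % 26 + ord("a")) if c.islower() else c
        for c in plaintext
    )

def decrypt(ciphertext: str, key: int) -> str:
    """Reverse the shift to recover the plaintext."""
    return encrypt(ciphertext, -key)

ciphertext = encrypt("attack at dawn", 3)
print(ciphertext)              # dwwdfn dw gdzq
print(decrypt(ciphertext, 3))  # attack at dawn
```

Only someone who knows the key (here, the shift amount 3) can reverse the transformation — the essence of encryption, even in this trivially breakable form.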
The features of cryptography are as follows:
- Confidentiality- Information can be viewed only by the person for whom it is intended; no one else can access it.
- Integrity- Information cannot be altered in storage or in transit between the sender and the intended receiver without the change being detected.
- Non-repudiation- The creator/sender of the information cannot later deny having intended to send it.
- Authentication- The identities of the sender and receiver are verified, as are the destination and origin of the information.
Cryptosystems are procedures and protocols that meet some or all of the above properties. Cryptosystems are often assumed to refer only to mathematical procedures and computer programs; however, they also cover human conduct, such as choosing difficult-to-guess passwords, turning off unused systems, and avoiding discussing confidential operations with strangers.
Types Of Cryptography
Generally, there are three types of cryptography:
1. Symmetric Key Cryptography
It’s an encryption system in which the sender and receiver of a message use the same key to encrypt and decrypt it. Symmetric-key systems are faster and simpler to use, but they have the drawback that the sender and receiver must exchange the key securely. A widely known symmetric-key encryption system is the Data Encryption Standard (DES), since superseded in most uses by the Advanced Encryption Standard (AES). In symmetric-key cryptography the transmitter and receiver use the same key (or, less commonly, different keys that are related in an easily computable way). This was the only type of encryption publicly known until June 1976.
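A minimal sketch of the shared-key idea, using repeating-key XOR rather than a real cipher such as DES or AES (the function name `xor_cipher` is invented for the example, and the scheme is not secure):

```python
from itertools import cycle

# Symmetric-key sketch (NOT secure): sender and receiver use the *same*
# secret key. XOR is its own inverse, so one function both encrypts and
# decrypts. Real systems use vetted ciphers such as AES instead.

def xor_cipher(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

shared_key = b"secret"  # must somehow be exchanged securely beforehand

ciphertext = xor_cipher(b"hello bob", shared_key)
recovered = xor_cipher(ciphertext, shared_key)  # same key, same operation
assert recovered == b"hello bob"
```

The `shared_key` line is the whole drawback in miniature: before any message can flow, both parties need the key, and getting it to them safely is itself a security problem.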
2. Hash Functions
No keys are used in this algorithm. A fixed-length hash value is computed from the plaintext, making it infeasible to recover the plaintext’s contents from the hash. Many operating systems use hash functions to protect passwords.
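The two defining properties — fixed output length and one-way computation — are easy to observe with Python’s standard-library `hashlib`:

```python
import hashlib

# A cryptographic hash maps input of any length to a fixed-length digest;
# recovering the input from the digest is computationally infeasible.
digest_short = hashlib.sha256(b"hi").hexdigest()
digest_long = hashlib.sha256(b"a much longer message" * 100).hexdigest()

# SHA-256 always yields 256 bits = 64 hex characters, whatever the input.
print(len(digest_short), len(digest_long))  # 64 64
assert digest_short != digest_long
```

Note there is no key and no decrypt step: the only way to “check” a hash is to hash a candidate input and compare, which is exactly how password verification works.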
3. Asymmetric Key Cryptography
This technique encrypts and decrypts data using a pair of keys: encryption is done with a public key, and decryption with the corresponding private key. The two keys are distinct and not interchangeable. Even though everyone may have access to the public key, only the intended receiver can decrypt the message, because only he holds the private key.
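The public/private split can be demonstrated with a toy RSA computation using tiny primes. This is purely illustrative — real RSA keys are 2048 bits or more, and the numbers below are far too small to be secure:

```python
# Toy RSA sketch (NOT secure; illustrative numbers only).
p, q = 61, 53
n = p * q                 # public modulus, 3233
phi = (p - 1) * (q - 1)   # 3120
e = 17                    # public exponent: (e, n) is the public key
d = pow(e, -1, phi)       # private exponent: modular inverse of e mod phi

message = 65
ciphertext = pow(message, e, n)    # anyone can encrypt with the public key
plaintext = pow(ciphertext, d, n)  # only the holder of d can decrypt
assert plaintext == message
```

Anyone who knows `(e, n)` can produce `ciphertext`, but reversing it requires `d`, which in turn requires knowing the factorization of `n` — easy here, infeasible at real key sizes.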
History of Cryptography
The Caesar cipher was the first of many ciphers used in cryptography. Compared with modern cryptographic techniques, early ciphers were much easier to break, although both require keys and plaintext. Despite their simplicity, these early ciphers were the first forms of encryption. Today’s algorithms and cryptosystems are significantly more advanced: they use multiple rounds of ciphers and re-encrypt the ciphertext of messages to provide the most secure data transport and storage. There are also irreversible (one-way) techniques, intended to protect a message’s contents indefinitely. The need to safeguard data ever more securely drives the development of more sophisticated cryptographic solutions.
The majority of ciphers and algorithms employed in the early days of cryptography have been cracked, rendering them ineffective for data security. Today’s algorithms can in principle be broken, but deciphering even a single message would take years, if not decades. As a result, the race to develop newer, more powerful cryptographic algorithms continues.
Cryptosystems use a set of techniques known as cryptographic algorithms, or ciphers, to encrypt and decrypt data and provide secure communications among computer systems, devices, and applications. In a cipher suite, one algorithm is used for encryption, another for message authentication, and a third for key exchange. This process, embedded in protocols and implemented in software running on operating systems (OSes) and networked computer systems, involves generating public and private keys for data encryption/decryption, digital signing and verification, and key exchange.
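This division of labor is visible in a TLS 1.2-style cipher suite name. The suite named below is a real registered suite; the string parsing is a simplified sketch, since naming conventions vary across TLS versions:

```python
# A TLS 1.2-style cipher suite name encodes which algorithm plays each role:
# key exchange, authentication, bulk encryption, and hashing.
suite = "TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256"

handshake, record = suite.split("_WITH_")

print(handshake)  # TLS_ECDHE_RSA      (ECDHE key exchange, RSA auth)
print(record)     # AES_128_GCM_SHA256 (AES-128-GCM cipher, SHA-256 hash)
```

Each role is filled by a different algorithm precisely because each has different requirements: key exchange needs asymmetric techniques, bulk encryption needs fast symmetric ciphers, and integrity checks need hash functions.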
Attackers can get around cryptography by hacking into the systems that handle encryption and decryption and by exploiting flaws such as the use of default keys. Cryptography, however, makes it far more difficult for attackers to access messages and data protected by encryption.
Concerns that quantum computing’s processing capacity could break current encryption standards prompted NIST to issue a call for papers to the mathematics and science communities in 2016 for new public-key cryptography standards. Unlike today’s computers, quantum computing uses quantum bits (qubits), which can exist in a superposition of 0 and 1 and can thus, in effect, explore multiple computations simultaneously.
While a large-scale quantum computer is unlikely to be built in the next decade, NIST believes that the existing infrastructure requires the standardization of publicly known and understood algorithms that offer a secure approach. The deadline for submissions was November 2017, and the analysis of the proposals was expected to take three to five years.
Many systems employ private-key cryptography to safeguard transmitted information and ensure secrecy during transmission. With public-key systems, confidentiality can be maintained without a master key or a huge number of keys.
In Cyber Security
By encrypting communications, cryptography can be used to secure them. Websites are encrypted using HTTPS. “End-to-end” encryption, in which only the sender and receiver can read messages, is used by Pretty Good Privacy (PGP) for email and by Signal and WhatsApp for secure messaging in general.
Operating systems use cryptography to keep passwords safe, hide parts of the system, and verify that software upgrades genuinely come from the manufacturer. Instead of storing plaintext passwords, computer systems store hashes; when a user signs in, the system runs the given password through a cryptographic hash function and compares the result with the hashed value on file. In this way, neither the system nor an attacker ever has access to the plaintext password. Encryption can also protect an entire hard drive. For example, University College London has adopted BitLocker (a Microsoft technology) to render drive contents opaque without users logging in.
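The sign-in check described above can be sketched with Python’s standard-library `hashlib`. The helper name `hash_password` is invented for the example; real systems should use a dedicated password-hashing scheme (bcrypt, scrypt, or Argon2), but the stdlib’s PBKDF2 illustrates the idea:

```python
import hashlib
import hmac
import os

# Password-storage sketch: the system keeps only a salted hash and the salt,
# never the plaintext password.
def hash_password(password: str, salt: bytes) -> bytes:
    # PBKDF2 applies the hash many times to slow down brute-force guessing.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

salt = os.urandom(16)                    # random per-user salt
stored = hash_password("hunter2", salt)  # kept on file alongside the salt

# At sign-in: hash the submitted password and compare with the stored value.
attempt = hash_password("hunter2", salt)
assert hmac.compare_digest(stored, attempt)   # constant-time comparison
assert hash_password("wrong", salt) != stored
```

The per-user salt ensures that two users with the same password get different hashes, and `hmac.compare_digest` avoids leaking information through comparison timing.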
Cryptography is also used in electronic money, blockchain, and cryptocurrency.