Have you ever flipped the write-protect switch on a floppy disk to block boot-sector infections and malicious overwrites; turned off the modem at night so hackers couldn't dial in; uninstalled the ansi.sys driver so a malicious text file couldn't remap your keyboard and format your hard drive at the next keystroke; or combed through autoexec.bat and config.sys to verify that no malicious entries had inserted themselves to run at startup?
Scenes like these are rare now. Hackers have made progress, and new techniques have replaced the outdated ones. Sometimes defenders do their job so well that attackers give up and move on to a softer target. Sometimes a defense is retired because we decide it no longer provides enough protection, or because it turns out to have unexpected weaknesses. Every new wave of technology, big or small, brings new threats: hackers quickly swap out the tools in their hands and toss last year's fashionable attack into the trash, while the security community struggles to keep up.
Perhaps nothing changes as quickly as computer technology, and as it advances, the burden of protecting it grows heavier. If you have been in computer security for long, you have watched many defensive technologies be born and die. Sometimes you solve a new threat just as the threat itself becomes obsolete. The pace of attack and technology keeps advancing, and even so-called cutting-edge defenses, such as biometric authentication and advanced firewalls, will eventually fail and exit the stage. Here are the security defenses destined for the history books. If you reopen this article five or ten years from now, the changes will likely exceed your imagination.
No.1: Biometric authentication
In the field of login security, biometric authentication looks like a miracle cure. After all, your face, fingerprints, DNA, or other biological markers seem like perfect login credentials. But that is a layman's view. To experts, biometrics are not so safe, for one simple reason: if a biometric marker is stolen, you cannot change it.
Take fingerprints, for example. Most people have only ten. Any time you use a fingerprint as a login credential, that fingerprint, or rather its digital representation, must be stored somewhere for comparison. Unfortunately, such stored identifiers are all too often compromised or stolen. If the bad guys steal them, how can a system tell your real fingerprint from the digital copy in the attacker's hands?
In that case, the only remedy is to tell every system in the world to stop accepting your fingerprints, which is practically impossible. The same holds for any other biometric signature: once an attacker has a digital version of your DNA, face, or retina, you cannot revoke or replace it.
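The contrast with passwords is easy to see in code. The sketch below is a minimal illustration, not any production scheme: a leaked password hash is recoverable because the record can simply be re-issued with a new salt and a new secret, while a leaked biometric template has no equivalent escape hatch.

```python
import hashlib
import os

def store_password(password):
    """Salted PBKDF2 hash: the stored credential, not the secret itself."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

# If the database leaks, the user picks a new password and we simply
# store a brand-new record; the old stolen hash becomes worthless.
old_record = store_password("hunter2")
new_record = store_password("correct horse battery staple")
assert old_record != new_record  # credential successfully rotated

# A fingerprint template offers no such rotation: the underlying biometric
# is fixed for life, so a leaked template can never be re-issued this way.
```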
And there is another problem: what if the biometric you log in with, say the fingerprint itself, is damaged?
Pairing biometrics with additional factors, such as a password or PIN, is one way to beat hackers. But two-factor schemes based on a physical element, such as a smart card or a USB key, do the same job without the drawbacks: if one is lost, an administrator can quickly issue you a new token, or you can set a new PIN or password.
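A replaceable second factor can be as simple as a time-based one-time password. The sketch below follows RFC 6238 (TOTP, i.e. HOTP computed over a time counter); the secret shown is the standard RFC test key. Unlike a fingerprint, a compromised secret is fixed by provisioning a new one.

```python
import hashlib
import hmac
import struct
import time

def totp(secret, at=None, step=30, digits=6):
    """RFC 6238 time-based one-time password: HOTP over a time counter."""
    counter = int((time.time() if at is None else at) // step)
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # dynamic truncation, RFC 4226
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)

# RFC 6238 reference secret; at T=59s the expected 6-digit code is 287082.
assert totp(b"12345678901234567890", at=59) == "287082"
```

Server and client derive the same code from the shared secret and the clock, so a lost or stolen token is retired simply by enrolling a fresh secret.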
Although biometric sign-ins are quickly becoming a fashionable security feature, they will never become ubiquitous. Once people realize that biometric logins are not what they appear to be, the approach will lose popularity or disappear, surviving only in combination with another authentication factor, or in scenarios where high security is not required.
No.2: SSL
Secure Sockets Layer (SSL) has been with us since its invention in 1995, and over those two decades it served us adequately. But if you haven't heard yet, the POODLE attack broke the SSL protocol beyond repair. Its replacement, Transport Layer Security (TLS), performs slightly better. Of all the security technologies in this article headed for the trash heap, SSL is the closest to being fully replaced: nobody should use it anymore.
So where is the problem? Countless websites still rely on or allow SSL. If you disable all SSL, which is now the general default in recent versions of popular browsers, many sites become unreachable; or they connect only because the browser or application accepts a downgrade to SSL. On top of that, there are still millions of old Secure Shell (SSH) servers on the Internet.
OpenSSH has suffered a string of compromises recently. Although roughly half of those attacks had nothing to do with SSL, the other half were caused by SSL vulnerabilities. Millions of SSH/OpenSSH sites still use SSL, although they shouldn't at all.
To make matters worse, the terminology experts use adds to the problem. Almost everyone in the computer security industry refers to TLS digital certificates as "SSL certificates," but this is purely a misnomer: those sites do not use SSL. It is like calling every bottle of cola "Coca-Cola" even when it is another brand. If we want the world to abandon SSL faster, we need to start calling TLS certificates by their proper name.
Stop using SSL, and refer to web server certificates as TLS certificates. The sooner we get rid of the word "SSL," the sooner it can be swept into the dustbin of history.
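In practice, abandoning SSL is a one-line configuration in most TLS stacks. A minimal sketch, assuming Python 3.7+ with the standard ssl module:

```python
import ssl

# PROTOCOL_TLS_CLIENT negotiates the best available TLS version and already
# refuses SSL 2.0/3.0; pinning the floor to TLS 1.2 also rules out early TLS
# versions with known downgrade problems.
context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
context.minimum_version = ssl.TLSVersion.TLSv1_2
context.load_default_certs()

# Certificate and hostname verification are on by default in client mode.
assert context.check_hostname
assert context.verify_mode == ssl.CERT_REQUIRED
```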
No.3: Public key encryption
It may surprise some people, but if practical quantum computing arrives, along with the cryptanalysis it enables, most of the public-key cryptography we use today, RSA, Diffie-Hellman (DH), and the rest, will become readable. Many have long believed that usable quantum computers are still years away, but that estimate may be blindly optimistic. If researchers really do deliver practical quantum computing, most known public-key schemes, including the most popular algorithms, will be easy to crack. Espionage agencies around the world have been hoarding encrypted documents for years, waiting for exactly this breakthrough; and if you believe certain rumors, they have already solved the problem and are reading all our secrets.
Some cryptographers, such as Bruce Schneier, have long questioned whether practical quantum computing will ever arrive. But even the skeptics cannot rule out the possibility, and once it is developed, all ciphertext encrypted with RSA, DH, or even Elliptic Curve Cryptography (ECC) becomes readable.
That is not to say no quantum-resistant algorithms exist. There are some, such as lattice-based encryption and Supersingular Isogeny Key Exchange. But if the public-key scheme you use is not in that class, your bad luck begins the moment quantum computing starts to spread.
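What a quantum computer would actually break is the discrete-logarithm problem underneath key exchange. The toy Diffie-Hellman below uses a deliberately tiny prime (2**64 - 59) purely for illustration; real deployments use 2048-bit groups, and Shor's algorithm would recover the private exponents of either.

```python
import secrets

# Toy finite-field Diffie-Hellman. Parameters are illustrative only:
# p = 2**64 - 59 is prime but far too small for real-world use.
p = 2**64 - 59
g = 5

a = secrets.randbelow(p - 2) + 1   # Alice's private exponent
b = secrets.randbelow(p - 2) + 1   # Bob's private exponent
A = pow(g, a, p)                   # public values, exchanged in the clear
B = pow(g, b, p)

# Both sides derive the same shared secret. An eavesdropper sees only
# g, p, A, B; recovering a or b from them is the discrete log problem,
# which Shor's algorithm solves efficiently on a quantum computer.
assert pow(B, a, p) == pow(A, b, p)
```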
No.4: IPsec
When enabled, IPsec protects the integrity and privacy of data transmitted between two or more points, chiefly through encryption. The technology was invented in 1993 and became an open standard in 1995. Hundreds of vendors support IPsec, and it runs on millions of computers.
Unlike the other examples in this article, IPsec is genuinely useful and works well. Its problems lie elsewhere.
First, although it is widely used and deployed, it never reached critical mass. Second, IPsec is complex, and not every vendor supports it. Worse, if one side of a conversation supports IPsec and the other, say a gateway or load balancer, does not, communication often breaks. In many companies IPsec remains merely an option, and very few machines are required to use it.
IPsec's complexity also creates performance problems. Unless you deploy specialized hardware at both ends of an IPsec tunnel, it noticeably slows every network connection that uses it. As a result, heavily loaded transactional servers, such as databases and most web servers, often cannot support it at all, and those are precisely the servers holding the most important data. If you can't protect most of your data with IPsec, what good does it do?
In addition, although IPsec is a "public" open standard, vendors' implementations often fail to interoperate, which further slowed, and in places prevented, its wide deployment.
But the real death knell for IPsec is the spread of HTTPS. Once you enable HTTPS, you no longer need IPsec; it is a choice between the two, and the world has made its decision. HTTPS won. As long as you have a valid TLS certificate, any compatible client can use HTTPS: no interoperability problems, low complexity. There is some performance impact, but for most users it is trivial. The world is rapidly becoming HTTPS-by-default, and in the process, IPsec will die.
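The low-friction setup that killed IPsec is visible in any modern TLS library: an authenticated, encrypted channel needs nothing from gateways or the peer's operating system. A minimal sketch using Python's standard ssl module:

```python
import ssl

# create_default_context() gives the settings HTTPS clients rely on:
# system trust store loaded, peer certificate required, hostname checked.
context = ssl.create_default_context()
assert context.verify_mode == ssl.CERT_REQUIRED
assert context.check_hostname

# Wrapping any TCP socket with this context yields the point-to-point
# confidentiality and integrity IPsec offered, negotiated per connection:
#   with socket.create_connection((host, 443)) as sock:
#       with context.wrap_socket(sock, server_hostname=host) as tls:
#           ...  # send and receive over the encrypted channel
```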
No.5: Firewall
The ubiquity of HTTPS essentially declares the end of the traditional firewall. People were writing articles to that effect three years ago, yet firewalls remain everywhere. But look at how they are actually run: most are poorly configured, and hardly any enforce the "least privilege, deny by default" rules that make a firewall worthwhile in the first place. As many people know, most firewall rule sets are far too permissive, and firewalls containing "allow all" rules are not rare. A firewall configured that way is barely better than none at all: it accomplishes nothing except adding drag to the network.
However you define a firewall, to be useful it must do one thing: permit traffic only on specific, pre-configured ports. As the world moves to HTTPS, eventually every firewall will be left with only a few rules: HTTP, HTTPS, and DNS. Other protocols, such as DNS and DHCP, will likewise start moving to HTTPS-only transports. In fact, it is hard to imagine a future that doesn't end in HTTPS everywhere. And when that day comes, what is left for the firewall to do?
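For illustration, "deny by default" reduces to a tiny decision rule. The sketch below is a schematic packet filter, not any real firewall's API; the allowed ports (80, 443, 53) are the usual assignments for HTTP, HTTPS, and DNS.

```python
# Explicit allow list; anything not listed falls through to the default drop.
ALLOWED = {("tcp", 80), ("tcp", 443), ("tcp", 53), ("udp", 53)}

def filter_packet(proto, dst_port):
    """Return 'accept' only for explicitly allowed services, else 'drop'."""
    return "accept" if (proto, dst_port) in ALLOWED else "drop"

assert filter_packet("tcp", 443) == "accept"
assert filter_packet("tcp", 23) == "drop"   # telnet is not on the list
```

As the allow list shrinks toward HTTP, HTTPS, and DNS, the rule table above carries less and less information, which is exactly the article's point.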
The primary defense a firewall provides is shielding vulnerable services from remote attack. Services with remote vulnerabilities can be exploited in moments, and the remotely exploitable buffer overflow was once the most common form of attack; recall the Robert Morris Internet worm, Code Red, Blaster, and SQL Slammer. Can you remember when the last world-class buffer-overflow worm struck? It was back in the early years of this century, and nothing since has matched the power of the worms of the 1980s and 1990s. Essentially, if you are not running unpatched, vulnerable listening services, you do not need a traditional firewall. You read that right: you don't need one.
Firewall vendors like to advertise "advanced" firewalls with features far beyond traditional products. But as soon as these so-called advanced firewalls perform deep packet inspection or signature scanning, only two outcomes follow: either network throughput drops sharply and the results are riddled with false positives, or they scan for only a tiny fraction of attacks. Most "advanced" firewalls scan for tens to hundreds of attack signatures, while more than 390,000 new malware variants appear every day, not counting those hidden inside legitimate activity.
Even if firewalls delivered the protection they claim, they would still miss the point, because the two main kinds of malicious attack companies face today are unpatched software and social engineering, and a firewall stops neither.
Put bluntly, having a firewall or not makes little difference. Perhaps they performed so well in the past that hackers switched to other kinds of attack. But whatever the reason, firewalls are nearly useless today, and that trend has been under way for a decade.