Centralization as a Factor Constraining the Development of Secure Communications
\
The entire history of cryptography, steganography, and information protection has been marked by the antagonism of two sides: attackers and defenders of information transmitted through one or more communication lines. At certain times the attackers gained the upper hand and entire classes of systems became vulnerable; at other times the defenders prevailed by discovering new, more effective methods of protection. In either case, it was the attackers who initiated each subsequent cycle, finding vulnerabilities and shortcomings in particular schemes and exploiting them to obtain the information they needed.
\
Throughout this history, attackers played a dual role of destruction and creation. In the short term (relative to the whole of that history), their actions caused despair among defenders, who saw the futility of the security they had imposed. At the same time, as defenders came to understand the attack vectors, they built better systems that countered the old methods of attack. The entire history of information security has thus been a unity of opposites, laid out in plain view.
\
Today, with cryptography as the primary and basic method of information protection, it can be said with confidence that defenders have outpaced attackers (setting aside the development of quantum computers). In the second half of the 20th century, cryptography evolved from an art (its classical form) into a full-fledged science (modern cryptography) thanks to the work of Claude Shannon, the standardization of DES, the discovery of asymmetric cryptography, hash functions, and digital signatures. These developments drove the redesign and improvement of security systems and led to cryptographic protocols for a wide range of specific and general tasks. Today there are algorithms and protocols such as AES, ChaCha20, RSA, Diffie-Hellman, ElGamal, SHA-256/512, and Keccak for which no practical attacks have been found, even after decades of open cryptanalysis and the rewards offered for breaking them.
\
At first glance this seems favorable for information security, but it is in fact misleading, because attackers have shifted their attack vectors by blending with the defenders. The steady interplay of competition and convergence between attackers and defenders ended when mass media and computer technology merged. Attackers no longer need to break systems directly (at least in the civilian sector), as they once did. They have split into two camps: some continue to confront increasingly sophisticated protection methods through cryptanalysis and the development of quantum technologies, while others have chosen to exploit mass media by merging with its protectors, forming an entirely new entity. This entity simultaneously protects and attacks the same people, and it no longer improves security measures when vulnerabilities are discovered, because it benefits from the insecurity itself.
\
These attackers (or defenders) are communication services (social networks, forums, messengers, etc.), whose information flow exceeds all other types of communication. Protecting client information is driven by the need to keep it from other services. Attacking this information is driven by the need to sell it to other services or hand it over to government agencies. Thus, communication services paradoxically perform two entirely different functions. The main issue with this is that outdated threat models in information security still focus on new or old cryptanalytic attacks and the development of quantum computers, which have minimal effect or will only matter in the future. Currently, we are dealing with a much more specific form of attack that continues to function covertly.
\
In modern realities, as society increasingly operates in virtual communication spaces, the lack of real security for confidential information and anonymity is becoming more noticeable. Companies, corporations, and governments are all trying to gather as much diverse information about individuals as possible:
\
gender
weight
age
financial status
country
city
street of residence
political views
clothing choices
relationships
friends
relatives
phone number
email address
biometric data
passport information
device used to access the internet
interests
hobbies
education, etc.
\
This amalgamation of data, connected solely by the individual it pertains to, becomes invaluable information, representing human capital, whose distinctive feature is the reproduction of consumption. The next logical step for the "collector" of this information is to sell it to third parties for economic gain and influence. When monopolization or cartel agreements between such "collectors" occur, it becomes possible to exert political influence, aimed primarily at suppressing competition and expanding the system, as well as maintaining established imperatives.
Stages of Network Communication Development
\
Decentralization, as the initial form of Internet communication, arose from academic research, leading to the global development of information technologies. This system not only represented external progress but also an inherent evolution, revealing negative aspects and internal contradictions in its implementation. The factor contributing to both its growth and downfall was the scalability issue. The inability to establish wide-scale connections led to the need for intermediary nodes and concentrated communication lines, thus giving rise to the core of centralization, which marked the starting point of future challenges.
\
Centralization, as the second stage in the development of Internet communications, emerged from the decomposition and decline of the initial decentralized system. While it offered scalability, centralization went through internal development stages, layering abstractions and negating decentralization, paradoxically becoming the final phase of its own evolution. With each iteration of its progress, the centralized system scaled further, entrenched itself deeper, and increasingly represented itself, forming second-order simulacra.
\
Simultaneously, the system neutralized external attacks that had previously been harmful but were now harmless to its functioning, such as denial-of-service attacks (DDoS) or the exploitation of vulnerabilities to extract internal information. Over time, the system's continuous growth created a society increasingly disconnected from its original mechanisms, more dogmatic and fragmented. The initiator of the system became its observer, and the system became a reproduction of observers. Eventually, the structure developed its own internal interests, inversely directed at users, fundamentally changing its interaction with them. Under this imperative, the system began creating third-order simulacra aimed at downplaying or hiding the true level of security, replacing reality with illusion within its abstract layers. The result of these false representations was a "security theater" aimed at maintaining the current order (the system) while concealing the true level of confidentiality.
\
While external threats to information security became completely ineffective against the centralized system due to its evolution, this does not mean the absence of internal threats. Scalability itself generates internal threats, creating contradictions within the system and leading to its eventual downfall. As the system expands, concentrates connections, and monopolizes, internal actors find it profitable to sell user information. Governments benefit from concentrating communication lines into singular spaces for control, and advertisers find it profitable to invest in mass systems with algorithms that serve ads based on confidential user information, increasing their profits. The system itself cannot solve this issue of information security in centralized systems, as its foundation is based on scalability and representation. Hence, the survival of the centralized system depends on the number of abstraction layers and copies without originals.
\
Hybridization, as the third form of Internet communication development, negates centralization while synthesizing it with decentralization. By retaining scalability but rejecting the internal development of centralization, it synthesizes the external growth of decentralization, offering transparent proof of functioning without layers of abstraction and third-order simulacra. Such a system is more resilient to both internal and external attacks: there is no longer an internal employee to leak information, the government cannot effectively gather information, and advertisers no longer find it profitable to invest. However, this progress comes with a relative regression, as the system's viability depends on the participants who maintain it, such as enthusiasts, volunteers, or nodes that earn income from donations or an internal mechanism (cryptocurrency). In any case, such systems lack consistent funding, and centralized systems (including the state itself) are hostile to their existence. The combination of centralization and hostility towards it is a key factor in the contradiction and decomposition of hybrid systems through future division, separation, and improvement.
\
Decentralization, as the fourth form of Internet communication development, becomes a scalable and secure user environment. The issues of hybridization are no longer present, as the system cannot be centralized due to its fully rhizomatic nature, which denies hierarchical structures. Every user ultimately becomes the embodiment of the system, its participant, and its form of support. At this stage, information security evolves, moving towards a higher level of security for its subjects. Decentralized systems eliminate the initial flaws of the early forms and ultimately bring about the negation of previous iterations.
Deterrent Factor in Communication Development
\
At present, the dominant form of network communication is in its second stage of development. The centralized shell is the most enduring medium, as it absorbs the most contradictions and paradoxically combines them successfully. The complexity of these connections postpones their eventual unraveling and the creation of alternative solutions. Indeed, both the preceding and the succeeding systems represent certain primitives, each with its own advantages and disadvantages, but, more importantly, without internal opposition within the system itself.
\
Unlike other systems, centralized ones clearly exhibit two differentiated interests: on one side are the service providers, and on the other, the users of the system. The former benefit from this paradigm because they control all the information passing through and stored within them. This is advantageous not only from an economic perspective (advertising, selling confidential data, bribes, etc.) but also from a political one (state propaganda, blocking opposition views, blackmail, lobbying, etc.). The influence of these providers, like a shadow, extends over the users of such services, gradually turning them into mere objects of market research.
\
For users, the system's appeal lies in its ease of use, good connection quality, ample storage, and user-friendly interface. From an external perspective, this situation might seem symbiotic: service providers build the entire infrastructure for clients in pursuit of future economic and/or political influence, while users adopt the system to comfortably interact with others. However, over time, this symbiosis becomes a form of parasitism as the growth of the user base shifts the system's purpose—from serving the users to exploiting them. Users lose sight of the causal link between the system's existence and their participation, becoming unaware of the massive data collection taking place. As a result, users become the only counterforce to communication services, the only entity capable of dismantling the system from within. If these users can not only recognize their interests but also successfully transfer them to alternative systems that align with the interests of the majority, centralized mechanisms will gradually be replaced by hybrid, decentralized alternatives.
\
At present, we are witnessing the rise of alternative systems. Hybrid systems are becoming more widely used (Bitcoin, Tor), and in some aspects, decentralized systems have even surpassed centralized forms in efficiency, as seen with the BitTorrent file-sharing protocol. These developments should, in theory, hasten the decline of centralization, but this is not happening in practice. Centralization persists due to its inherent longevity, which is sustained by the following factors.
\
Firstly, the clear interests of one group (profit, control) and the abstract interests of the other (communication, information search) lead the latter group to engage only in passive resistance, resulting in no decisive changes. However, this contradiction is key because it slowly drives the development of alternative solutions. A notable example is the public disclosure of the PRISM project, which sparked global discontent without yielding any concrete results. Monopoly corporations continue to collaborate with state apparatuses.
Secondly, the comfort of using centralized services overshadows security concerns, as most users are more likely to choose a fast system over a secure but slow one.
Thirdly, centralized services often give the illusion of security without genuinely striving for it. Centralized systems cannot enhance security significantly because doing so would contradict the second point—convenience.
Fourthly, centralized systems have little incentive to improve actual security, as the political influence they gain from their economic power allows them to absorb penalties for data breaches, which often cost less than hiring cybersecurity experts.
Fifthly, centralized systems inherently gravitate toward monopoly by concentrating connections. As a result, many services, both explicitly and implicitly, begin to merge, expanding and successfully suppressing alternative systems, whether hybrid, decentralized, or small-scale centralized ones. Antitrust bodies, themselves products of centralized mechanisms, rarely oppose such monopolies in practice.
Sixthly, the economic basis of centralized systems prevents them from breaking free from the existing paradigm, as centralization is a byproduct of the economic necessity of managing resources, including human ones. Breaking this paradigm would inevitably lead to bankruptcy and subsequent absorption by another, more successful, centralized system.
Seventhly, the centralized form is more flexible in developing new communication technologies because it disregards client security and has access to all necessary user information. These characteristics allow centralized systems to develop new solutions more quickly, outpacing alternative systems by several steps.
Eighthly, decentralized systems are subject to "corrosion" by centralized forms. This occurs because decentralized systems strive for faster, more efficient connections by establishing stable nodes, which leads to the concentration of subsequent connections and the relative regression of rhizomatic elements.
\
Thus, the development of post-centralized network communications is a matter for the distant future. With each passing day, contradictions accumulate, which plays a dual role. On one hand, these contradictions push the system toward its own demise by revealing flaws that must be addressed. On the other hand, the sheer number of contradictions becomes a deterrent, requiring more time to analyze and resolve the system's components. In any case, on the decaying, decomposing, and self-restoring ground of the old system, small sprouts of future network communications can already be seen. These will be capable of providing a real, rather than fictitious, level of security, protecting users' personal and confidential information.
Technical Description of the Problem
\
When addressing security issues in communication channels that use cryptographic protocols involving participants A and B, as well as a trusted third party T, the focus is usually placed on the latter. This is logical, because the trusted intermediary T becomes a "legitimate" attacker in the eyes of participants A and B, capable of performing man-in-the-middle (MITM) attacks and putting the system in an unstable state that requires absolute trust. This attack rests on the unresolved problem of trust, which is destructive in its own right, but it is overshadowed by a more hidden and even more destructive issue.
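To make T's position concrete, the sketch below (Python, toy group parameters and invented variable names, not drawn from any particular protocol) shows how an intermediary relaying an unauthenticated Diffie-Hellman exchange ends up sharing a key with each side and can transparently re-encrypt all traffic between them.
```python
# Sketch only: toy Diffie-Hellman parameters, not a secure implementation.
import secrets

p = 2**61 - 1                         # small Mersenne prime, for illustration only
g = 2

a = secrets.randbelow(p - 2) + 2      # A's secret exponent
b = secrets.randbelow(p - 2) + 2      # B's secret exponent
t_a = secrets.randbelow(p - 2) + 2    # T's secret towards A
t_b = secrets.randbelow(p - 2) + 2    # T's secret towards B

A_pub, B_pub = pow(g, a, p), pow(g, b, p)

# T intercepts the exchange and relays its own public values instead of A's and B's:
key_with_A = pow(A_pub, t_a, p)       # key T shares with A
key_with_B = pow(B_pub, t_b, p)       # key T shares with B

# A and B each believe they derived a key "with the other side",
# yet each key is actually shared with T:
assert pow(pow(g, t_a, p), a, p) == key_with_A
assert pow(pow(g, t_b, p), b, p) == key_with_B

# T can now decrypt traffic from A with key_with_A and re-encrypt it
# towards B with key_with_B (and vice versa), remaining invisible to both.
```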
\
The possibility of an attack by the receiving entity arises from the use of cryptographic protocols designed to protect "client-server" communication, where the server is treated as the recipient of information and the client as the sender. In most cases, however, the server is not the true recipient but merely an intermediary node connecting two or more clients, which produces a different kind of communication, "client-client", that these protocols largely ignore. The problem strikes at the very foundation of computer networks, because it leaves all information (interests, messages, contact details, political views, etc.) in an open, transparent state at the intermediary node. A vivid example of this phenomenon is seen in modern messengers, social networks, forums, chats, file services, and so on, where communication does not happen directly (as the protocols assume) but always passes through a third-party service or platform.
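A minimal sketch of the difference, assuming the third-party `cryptography` package and invented variable names: in the "client-server" model the relay holds the encryption key and therefore sees the plaintext, while in a "client-client" (end-to-end) model it only ever forwards ciphertext.
```python
# Sketch only: Fernet symmetric keys stand in for whatever key agreement
# a real protocol would use.
from cryptography.fernet import Fernet

# --- "client-server" model: the relay terminates encryption ----------------
server_key = Fernet.generate_key()               # key shared between client and service
to_server = Fernet(server_key)

msg = b"meet at 18:00"
in_transit = to_server.encrypt(msg)              # protected on the wire...
seen_by_service = to_server.decrypt(in_transit)  # ...but plaintext at the service
assert seen_by_service == msg                    # the intermediary reads everything

# --- "client-client" model: the end-to-end key never reaches the relay -----
e2e_key = Fernet.generate_key()                  # shared only by the two clients
e2e = Fernet(e2e_key)

ciphertext = e2e.encrypt(msg)                    # the service only forwards this blob
plaintext = e2e.decrypt(ciphertext)              # only the peer recovers the text
assert plaintext == msg
# The service never holds e2e_key, so its view is reduced to ciphertext and metadata.
```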
\
This state of affairs changes the picture significantly, as it revives the fundamental problem and challenge of classical cryptography, the fight against eavesdropping, which was considered theoretically solved with the advent of asymmetric cryptography. The problem is far more serious than the classical MITM attack, because it requires far fewer resources from the attacker while covering a far larger number of targets. It is the panopticon of modern society, in which attackers and victims switch roles, the method of surveillance is inverted, and the victim becomes the initiator of their own surveillance.
\
Victims voluntarily connect to inherently compromised channels, choosing from multiple tracking options for their "self," while attackers simply create the necessary connections and surveillance platforms to overshadow the existence of more secure alternatives. As a result, the confidentiality of modern services becomes mere decoration—a theater of security, a simulacrum that references a non-existent, hypostatized security, used as a marketing buzzword. Simultaneously, the convenience of these services becomes the foundation, philosophy, and propaganda that gradually replaces security, much like the parasitic "Cymothoa exigua."
\
This evolution leads to the rise of trust-based systems, where not only trusted nodes become attackers, but also intermediary recipients, resulting in significant risks of compromising stored and transmitted data between genuine participants. As the system evolves, it starts to support implicit connections between heterogeneous communication platforms, duplicating information across multiple platforms for data collection, marketing, and targeted advertising. Consequently, all the aforementioned factors lead to a clear violation of user privacy, with certain de-anonymizing consequences.
\
However, it is impossible to eradicate such a trust-based system completely: doing so would entail real degradation of software performance and architectural complications, and trust can never be fully eliminated in any case. Therefore, the goal is not to destroy this system but to replace it with a more secure one, relegating it to the niches where it remains useful. In all other cases, new systems must be built that reduce the power of trust, where the very structure of the system protects objects and anonymizes subjects. Systems of this kind already exist: anonymous networks, client-secure applications, and secret communication channels.
What is the Problem of trust?
It is the inability to build a secure, monolithic, and self-expanding system based entirely on cryptographic algorithms for the end participants, without intermediary nodes that verify participants' identities and without side communication channels with pre-established trust. The problem stems from the difficulty of authentically distributing public keys. In decentralized, rhizomatic systems it is even more acute, since the only remaining option is the use of side communication channels, that is, direct trust, from which a trust network can then be formed.
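As a rough illustration, assuming placeholder key material and a hypothetical `fingerprint` helper: the only defense against key substitution by an intermediary is comparing key fingerprints over a side channel the intermediary does not control.
```python
# Sketch only: placeholder keys and an invented fingerprint helper.
import hashlib

def fingerprint(public_key: bytes) -> str:
    """Short, human-comparable digest of a public key."""
    return hashlib.sha256(public_key).hexdigest()[:16]

alice_key = b"-----ALICE PUBLIC KEY-----"      # what Alice actually generated
mallory_key = b"-----MALLORY PUBLIC KEY-----"  # what the intermediary substituted

received_key = mallory_key                     # what Bob got through the service

# Side channel with pre-established trust: Alice reads her fingerprint aloud,
# shows a QR code in person, etc. The intermediary does not control this channel.
announced = fingerprint(alice_key)

if fingerprint(received_key) != announced:
    print("Key substitution detected: the channel cannot be trusted")
```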
What is the Power of trust?
It is the number of nodes involved in storing or transmitting information that is presented to them in an open format. In other words, these nodes can read, alter, and modify the information, since it is in a completely transparent state for them. The greater the power of trust, the higher the probability that individual nodes, and thus the data they hold, will be compromised. One such node is usually the recipient. Zero power of trust therefore means the absence of any interaction with the information provided; this is virtually impossible in client-server interactions, but in client-client interactions the state comes very close to the ideal. Although it can only be considered approximate, because key distribution channels and transport protocols must still be trusted, this state is nonetheless necessary.
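A small sketch of how this quantity could be counted, using an assumed, deliberately simplified model of a message route:
```python
# Sketch only: a simplified data model of the nodes a message passes through.
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    sees_plaintext: bool   # True if the information is transparent to this node

def power_of_trust(route: list[Node]) -> int:
    """Count the nodes that store or relay the information in the open."""
    return sum(node.sees_plaintext for node in route)

# "Client-server" messenger: the service relays and stores readable messages.
client_server = [Node("service", True), Node("recipient", True)]

# "Client-client" channel over the same service: only the true recipient reads it.
client_client = [Node("service", False), Node("recipient", True)]

print(power_of_trust(client_server))   # 2
print(power_of_trust(client_client))   # 1, approaching the ideal of 0
```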