TLS. IPv6. IEEE. RFC. IMAP. PPP. UUCP.

These are just a few examples of acronyms from the world of network protocols and standards. Some are abbreviations for protocol names. Others represent common terms or organizations in our niche.

Ever wonder who comes up with acronyms like these, how they affect networking, and how you can have a say in the development of protocols and standards? Keep reading for an overview of this vitally important part of the networking ecosystem.

What’s a network protocol?

Put simply, a protocol defines how two devices on a network communicate with one another. Just as a conversation between two people works only if they speak a common language, computers also need to agree on how to send and receive data before they can be networked together effectively.

Network protocols are also similar to natural human languages in that they have three basic components: syntax, semantics, and timing.

Syntax defines how data is structured: the order in which pieces of information are packaged by the sender and opened up by the receiver.

Semantics determine what individual pieces of information within a network protocol mean. They allow the sender and receiver of information to interpret the pieces correctly, depending on where in the stream of data they appear.

Timing definitions govern how quickly data can be sent and received, as well as when it should be sent. To work effectively, a protocol needs to ensure that both the sender and the receiver are prepared to communicate with one another at the right time, and that one isn’t sending data at speeds too high or low for the other.
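To make the syntax/semantics distinction concrete, here's a minimal sketch in Python of a made-up protocol header. Everything about this protocol (the field layout, sizes, and names) is invented for illustration: the `struct` format string pins down the syntax (byte order and field widths), while the meaning we agree to assign each field, such as version, message type, and payload length, is the semantics.

```python
import struct

# Syntax: a hypothetical 4-byte header -- one unsigned byte for the
# protocol version, one for the message type, and a 16-bit big-endian
# payload length. The format string "!BBH" fixes byte order and sizes.
HEADER_FORMAT = "!BBH"
HEADER_SIZE = struct.calcsize(HEADER_FORMAT)

def pack_message(version, msg_type, payload):
    """Sender side: package the fields in the agreed-upon order."""
    header = struct.pack(HEADER_FORMAT, version, msg_type, len(payload))
    return header + payload

def unpack_message(data):
    """Receiver side: interpret each field by its position (semantics)."""
    version, msg_type, length = struct.unpack_from(HEADER_FORMAT, data)
    payload = data[HEADER_SIZE:HEADER_SIZE + length]
    return version, msg_type, payload

message = pack_message(1, 2, b"hello")
print(unpack_message(message))  # (1, 2, b'hello')
```

As long as both sides agree on that format string and field meanings, the sender and receiver can be written by different people, in different languages, on different machines. Timing rules (how fast and when to send) would sit on top of this and aren't shown here.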

There are myriad protocols in existence. Most of them define how to share specific types of data, such as when someone opens a Web page, accesses a networked file system, or sends an email.

What’s a standard?

Agreeing to common syntax, semantics, and timing definitions for a protocol is easy enough if you’re dealing only with other computers in the same office or town, or if all parties are using the same hardware and software. But how do you ensure the whole world sticks to the same conventions within a protocol? That’s where standards come in.

Standards are guidelines that explain to all IT stakeholders — from device manufacturers to software programmers and network administrators — how a particular protocol should operate. As long as everyone adheres to a common standard, and provided the definitions of that standard are open to the public, the protocol guarantees two devices can communicate, even if they were built by different companies or are running different operating systems.


Hardware and software are different beasts

On that note, it’s important to understand that protocols and standards aren’t software code or hardware design blueprints. Instead, they simply explain what software and hardware should do in order to send and receive information.

It’s up to developers and engineers to decide how to implement the functionality defined by a protocol. There’s usually more than one way to do that.

You can think of it as the difference between apple pie as a type of dessert, and your grandmother’s apple pie. The former is like a protocol or standard because it refers to a dessert that can be made in any number of ways, yet always has the same basic characteristics no matter who produces it. The latter is a specific implementation of that dessert made from a particular recipe.

Some implementations of a protocol may be faster, more secure, and less buggy than others. But as long as they all adhere to the same basic standards, they’ll be mutually compatible.

What are open standards?

Not all standards are open. Sometimes, hardware or software companies try to implement closed or proprietary protocols that work only with their own products.

Remember the 1990s, when many websites worked well only in particular browsers? A lack of open protocols and standards was the reason.

Sometimes closed standards can help a company by giving it an edge in the market. Theoretically, closed standards can make data transfers more secure by concealing the inner workings from hackers looking for a flaw to exploit.

But in general, closed standards rarely serve the interests of the IT community as a whole. It’s usually better to implement open standards that everyone can use. That makes innovation and interoperability easier.

After all, the Internet as we know it today wouldn't exist without a common set of open network protocols (like HTTP, FTP, and IMAP) that everyone uses when serving Web pages, uploading files, or sending email, no matter which operating system they run or what type of device they use. Open protocols are baked into modern networking.
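You can see what "open" means in practice with HTTP. Because its message syntax is published in a public standard (RFC 9112), any program in any language on any operating system can hand-build a valid request. A minimal sketch in Python (the host name is just a placeholder):

```python
# A bare HTTP/1.1 request, composed by hand. The exact bytes below --
# request line, Host header, blank line -- follow the open standard,
# so any compliant server can parse them.
host = "example.com"  # placeholder host for illustration
request = (
    f"GET / HTTP/1.1\r\n"
    f"Host: {host}\r\n"
    f"Connection: close\r\n"
    f"\r\n"
).encode("ascii")

print(request)

# To actually send it, you'd open a TCP socket to port 80, e.g.:
#   import socket
#   with socket.create_connection((host, 80)) as s:
#       s.sendall(request)
#       reply = s.recv(4096)
```

No library or vendor-specific API is required; the protocol itself is the shared contract.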

Who makes open standards and protocols?

In some cases, standards develop in a de facto fashion. That means they’re implemented in a decentralized way by a community of interested stakeholders, and adopted on a large enough scale to become universal.

More often, however, protocols and standards come into widespread use through the work of standards bodies. These are organizations that oversee the creation of protocols and standards and officially approve their adoption, giving them de jure status.

Standards bodies usually can’t compel anyone to adopt their standards — you won’t face jail time for failing to implement the right version of NFS, for example — but by encouraging everyone to follow a common set of guidelines, they make interoperability much simpler.

In the networking world, the most important standards bodies include the International Organization for Standardization (ISO), the ITU Telecommunication Standardization Sector (ITU-T), the American National Standards Institute (ANSI) and the Institute of Electrical and Electronics Engineers (IEEE).

What’s an RFC?

Standards bodies aren’t dictatorial organizations that impose protocols from the top down. Instead, community members get to play a role in creating standards by submitting Requests for Comment, or RFCs.

RFCs, which originated in the early days of the Internet, give individuals or groups a way to propose a new standard or changes to an existing one. Once accepted, they're published in the official RFC series, where anyone can read them.

RFCs exist at several maturity levels. A proposal typically circulates first as an Internet-Draft for community review; if approved, it's published as an RFC. Standards-track RFCs start out as Proposed Standards and can later be elevated to full Internet Standards. Experimental, Historic, and Informational RFCs also exist, although none of these categories carries standards-track status.

And that’s the 411. Stay tuned for a future post where we debate a thorny question now emerging in the industry: Is it time to get rid of standards bodies?