bill's blog


Anyone who's made a serious living administering computers has heard of the OSI model (or the seven-layer model). It is a networking framework that allows data to be passed between networks in an efficient manner. At the bottom of the model are the hardware layers… It is these layers (the data link and physical) that do the heavy lifting, allowing the ones and zeros to be passed between computers. Further up the model are the network, transport and session layers, which are 'logically' involved with moving the data back and forth. Lastly, at the top of the model are the presentation and application layers. These are more protocol-specific (FTP, HTTP, SSH, etc.). Yes, I know I'm oversimplifying this!
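As a cheat sheet, the seven layers (and the kind of thing that lives at each) can be sketched in a few lines of Python. The examples in each row are my own illustrative picks, not an exhaustive list:

```python
# The seven OSI layers, bottom (1) to top (7).
# Example protocols/devices are illustrative, not exhaustive.
OSI_LAYERS = {
    1: ("Physical", "cables, hubs, raw bits on the wire"),
    2: ("Data Link", "Ethernet frames, MAC addresses, switches"),
    3: ("Network", "IP, routing between networks"),
    4: ("Transport", "TCP/UDP, end-to-end delivery"),
    5: ("Session", "dialog control between hosts"),
    6: ("Presentation", "encoding, compression, encryption"),
    7: ("Application", "FTP, HTTP, SSH"),
}

# Print top-down, the way the model is usually drawn.
for num in sorted(OSI_LAYERS, reverse=True):
    name, examples = OSI_LAYERS[num]
    print(f"Layer {num}: {name:<12} - {examples}")
```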

The OSI model was formulated so that different vendors could concentrate on their areas of expertise and still maintain compatibility. For example, a vendor building a network switch only really needs to deal with layers 1 and 2 (and sometimes 3). Layers 3 through 5 can be considered the network stack; this is usually the job of the OS manufacturer. Vendors creating software (such as an FTP client) only have to deal with the upper layers, 6 and 7. Yes, again I'm oversimplifying this! A really good explanation of what is happening at which layer can be found at

Data encapsulation is the process of enclosing higher-level protocol information in lower-level protocol information. So in really basic terms, think of it this way… I have a file that I need to get to my professor at school. Using an FTP client I'm taking advantage of layers 6 & 7. The FTP client encapsulates the information (the file) and passes it to the operating system, which determines how and where the file should go. So for example, since we are transferring a file, we're going to use TCP to ensure that our file gets to the destination intact. The networking stack of the OS encapsulates the source and destination information and passes the "packet" to the hardware layers. The hardware layer works to pass the ones and zeros that make up my file to the destination host. Once the packets are received on the other side… the layers are stripped off so that ultimately the person on the other side is left with the original file I sent. Which is all I'm really concerned about… getting that file to my professor and him having the ability to read it.
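To make the wrapping concrete, here's a toy sketch of that trip down the stack. All the header fields are invented for illustration; real TCP/IP headers are binary and far richer than this:

```python
# Toy sketch of encapsulation: each layer wraps the data from the layer
# above with its own header. Header fields are invented for illustration.
def app_layer(file_bytes):
    return b"FTP|" + file_bytes                    # layers 6/7: the FTP client

def transport_layer(data):
    return b"TCP|src=2121|dst=21|" + data          # layer 4: reliable delivery

def network_layer(segment):
    return b"IP|10.0.0.5>10.0.0.9|" + segment      # layer 3: addressing/routing

def link_layer(packet):
    return b"ETH|aa:bb>cc:dd|" + packet            # layer 2: frames on the wire

wire = link_layer(network_layer(transport_layer(app_layer(b"homework.doc"))))

# On the receiving host each layer strips its own header back off,
# leaving the professor with the original file.
received = wire
for header in (b"ETH|aa:bb>cc:dd|", b"IP|10.0.0.5>10.0.0.9|",
               b"TCP|src=2121|dst=21|", b"FTP|"):
    assert received.startswith(header)
    received = received[len(header):]
print(received)  # -> b'homework.doc'
```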

This all being said… the big advantage of data encapsulation is that it allows vendors to concentrate on their areas of expertise… software development… hardware development… knowing that vendors specializing in the layers above and below will concentrate on their areas of expertise. The big disadvantage is that it adds overhead! Additionally, not all vendors are equally capable. This could lead to problems in the way information gets from point A to point B.

Computers and science fiction are intrinsically bound at the hip! And no individual ties the two together better than Star Trek's Mr. Spock! Spock could be seen in most episodes working at his computer workstation fine-tuning the results of a search, calculating odds or presenting a definitive course of action. But it wasn't Spock's love of computers that made him so special… It was his impeccable logic! SO sound was his logic that Kirk would go on to say, "You'd make a splendid computer, Mr. Spock" (Roddenberry, 1967).

We as human beings often think with emotion rather than logic. Thinking with emotion clouds logical thought. In IT the ability to think logically about a problem is a must… ones and zeros. It helps with the reasoning process… "I understand that your computer seems slow, but can you be more precise?" If we can eliminate subjectivity, we can often get at the root of the problem much more expeditiously. But logic isn't only used to troubleshoot software bugs. Logic comes in handy for project management concerns as well.

We are constantly moving solutions into and out of the organizations we work for. Returning machines on lease seems pretty benign. We buy machines… they get delivered… we image them… we deploy them to the end-user's desktop. One needs to worry about interrupting the user. We don't want to incur additional costs because we can't turn around the number of machines ordered. It takes a lot of planning. The more you touch a piece of hardware, the more time it takes to deploy… and the greater your chances of messing up! Understanding how to stage the machines, and being flexible to change, needs to be a part of your logic.

Technology data migrations are another place where logic plays a hand. The more complex a migration is, the more logic needs to be applied for a successful outcome. One needs to be able to determine the order in which changes happen. Formatting a hard drive before you move the data off would be a really bad thing. Does the user's home directory reside on the server, or is it cached locally on their laptop? When was the last time the data was synced? These are just some of the questions you need to answer to plan adequately. It is logic that you use to formulate the best way to make things happen.

Common sense… plays a part here too. The most common meaning of the phrase is good sense and sound judgment in practical matters (Wikipedia, 2010). It is this judgment that, when strung together, makes our logic sound as well! Logic does not come naturally to some. Just like our reasoning skills, logic needs to be learned. The study of logic enables us to communicate effectively, make more convincing arguments, and develop patterns of reasoning for decision making (Angel, 2007). The more you exercise your logical thinking, the better you become at it.


Angel, A., Abbott, C., & Runde, D. (2007). A Survey of Mathematics with Applications. Pearson/Addison Wesley.

Roddenberry, G. (1967, February 9). Star Trek [The Return of the Archons]. New York: National Broadcasting Company.

Various. (2010, April 20). Common sense. Retrieved April 21, 2010 from

Computer data is physically nothing more than ones and zeros; yet the information those ones and zeros represent can prove to be vastly important. On a very personal level it could represent our life's savings in a QDF (Intuit Quicken) file, or it could be something a little more dramatic such as the design plans of a Blackhawk helicopter! Either way we wouldn't want to let the information fall into the wrong hands. There are many ways to protect our data; certainly in the case of the Quicken data file, Intuit allows for password protecting the file. Microsoft Office files and Adobe PDFs both have their own password protection schemes. BUT is your data truly safe? In the case of the latter two… it's a fairly trivial task to crack the passwords. So what's a person to do? Well, you could always hide things in plain sight using any number of steganographic tools! BUT all you're really doing is hiding your data in much the same way a pirate buries his booty! No… what we want (and many governmental agencies need… HELLO VA!) is whole disk encryption. There are many companies that provide encryption schemes for the boot partition… enter a password and boot your computer. This type of protection can get a bit expensive and problematic from an IT management perspective. In fact we really don't need to encrypt the entire disk… in actuality… we only need to encrypt the partition that contains our data. And for that we don't need to spend a lot of money! Enter Truecrypt.

Truecrypt is an open source, cross-platform disk encryption tool. You can use it to create encrypted file containers. It will even do traditional boot disk encryption of a Windows partition! But as I mentioned earlier, we're looking to encrypt just a single partition that houses our important data. Truecrypt offers the AES-256, Serpent, and Twofish encryption algorithms, and it provides plausible deniability! During the Iran-Contra hearings, Senator Sam Nunn (D-Georgia) provided a perfect definition of plausible deniability…

Everybody I’ve talked to in the intelligence community and around town . . . tells me that the definition of that term is that when you set up plausible deniability for someone . . . they know the facts in question, but they can deny the knowledge, and that the denial is believable.” (Schwartz, 1987)

WOW it doesn’t get any better than that! SO how do we use this tool! First you can download the application from Once downloaded the first thing I would do is make sure that I indeed downloaded the correct software by validating the PGP key provided by the developers! We’re talking about protecting your trusted data… Take the extra step!

Install the application… Double-click to launch the executable!

We want to encrypt a USB thumb drive with a hidden volume… The default window should look similar to this. 

Click on Create Volume. You'll be prompted through a bunch of questions. In our case, because we are encrypting an entire USB thumb drive, we should be selecting…

Next, because we want plausible deniability, select the second option… If it was good enough for Ollie North, it's good enough for me!

You’ll next be asked to select a disk to encrypt. You will be asked to provide the password of an administrator of the system you are working on. This is needed because Truecrypt will eventually be formatting out the disk and this requires administrative permissions.

Select the encryption and hash algorithms you prefer…

Select OK and Truecrypt will begin encrypting your thumb drive. This could take some time… in the case of a 2GB thumb drive, it took about 15 minutes.

The one gotcha is that you will need to populate the outer volume with files that look important NOW! We do this so that if you are forced to give up the password… when "they" unlock the drive it will look as if they got what they wanted. So make those files look good without giving away the farm!

After the process has finished, you will be prompted to create the hidden volume.

Creating the hidden volume is very similar to creating the outer volume! You'll be prompted again to select which encryption and hash algorithms you prefer for your hidden partition. Next you'll be asked how much space to allocate to your hidden partition… in my case I chose to allocate about 3/4 of the space!

You’ll be asked to select a file system for the hidden volume. In my case I chose FAT as this gaves me the most options with regard to the OSs I can use the thumb drive with!

When the process is finally completed you’ll be presented with the following disclaimer…

Congratulations… You've just created your encrypted, plausibly deniable USB thumb drive!


Schwartz, J. (1987, July 22). Plausible Deniability. Series: The Iran-Contra Hearings: The Tenth Week of Testimony. The Washington Post.

WOW… Where to start… Hard drives are the garbage dump of a computer… Sure, we strive to keep our data organized, but in actuality… we have zero control over where the computer places our data on disk. Files are written to the first available sector on disk. These sectors are reserved and freed based on which files are in "use" and which have been "deleted"! In actuality no file is truly deleted until it is overwritten. Point of fact… the pointer to the file on disk is the only thing that is deleted when we empty the Trash/Recycle Bin.

A bit-stream copy of a hard disk is an exact duplicate of the ones and zeros on the disk. One needs to have a hard drive of equal or larger size than the one being copied… Some may call this a disadvantage, BUT the fact of the matter is that disk is cheap. Disk sizes keep growing while the cost remains fairly constant. No real disadvantage there.

It takes a disk of equal size because the copy includes the file/disk slack. Why is this important? Because disk storage is broken up into blocks, and the block size is fixed by the file system of the OS that is operating on the disk. If the block size is 8KB and your actual file size is only 4KB… that leaves 4KB of allocated but unused space at the end of the block. There are tools that can write data to that slack space. Tricky… tricky they are. You want to be able to capture everything that is on disk… no matter what.
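The arithmetic is simple enough to sketch, assuming the 8KB block size from the example above:

```python
import math

BLOCK = 8 * 1024  # 8KB blocks, as in the example above

def slack_bytes(file_size, block=BLOCK):
    """Bytes left over in the file's final block -- allocated but unused."""
    blocks_used = math.ceil(file_size / block)
    return blocks_used * block - file_size

print(slack_bytes(4 * 1024))   # 4KB file in an 8KB block -> 4096 bytes of slack
print(slack_bytes(8 * 1024))   # exact fit -> 0 bytes of slack
```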

Because bit-stream copies capture every byte of data on disk, they take longer to make. Standard backups/mirror images only copy the actual data and then fit it into its block-size allocation on the destination disk. One would miss the slack space… AND the "deleted" files! Bad idea.
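A bit-stream copy really is that simple in concept: read every byte, write every byte. Here's a minimal sketch, assuming you're copying a disk image file (on a live system you'd point a proper forensic tool at the raw device node, with the right permissions):

```python
import hashlib

def bitstream_copy(src_path, dst_path, chunk=1024 * 1024):
    """Copy src to dst byte for byte and return an MD5 of everything read,
    so the duplicate can be verified against the original."""
    digest = hashlib.md5()
    with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        while True:
            block = src.read(chunk)
            if not block:
                break
            digest.update(block)
            dst.write(block)
    return digest.hexdigest()

# Because this copies raw bytes rather than files, slack space and the
# contents of "deleted" files inside the image come along too --
# which is exactly what a file-level backup would miss.
```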