
For a long time now I have been waiting for a corporation or a government to be sued for privacy breaches. We all know that certain businesses and governments ignore our privacy and collect data about us in a most intrusive and illegal manner.

Even though most countries have privacy laws in place, governments ignore the issue of the internet and make no real laws to protect us from spyware and other monitoring malware. Even though webpages have been available to the public since the early 1990s, nothing has really been done to protect us.

Meanwhile, governments all over the world are busily installing CCTV to monitor their populations. So images of you, and things you say to people, are recorded and monitored without your knowledge.

It seems to me we have already lost our privacy. I think the privacy laws were set up just to force us to fully identify ourselves to governments or corporations. This way they can withhold services easily, and people are less likely to complain, because they always know exactly who you are. The process of proving your identity also acts as a means of revenue raising for governments. Current privacy laws mean that I have to supply ALL my information, over and over, increasing the risk that I myself will become a victim of ID theft.

I would like to see the privacy laws overhauled and updated to at least provide some protection on the internet. It’s a disgrace that this obvious breach of the public’s privacy has yet to be addressed.

2014-12-14 Privacy

I Have Something to Hide, by raincoaster

December 5, 2014

This is a reprint of the May 2013 post of the same title from the Neurovagrant blog of Ian Campbell. He kindly permitted us to repost this crystal clear lesson in why encryption is not just an option, but a moral imperative.

In the debate about NSA surveillance, any surveillance, in the debate about any action government and especially law enforcement may take, the oft-repeated party line goes like this: “I’m not concerned about it because I have nothing to hide.”

This argument fails on a number of levels. The most basic level is that it assumes we each possess perfect information. Perfect information is a concept in two fields that I follow closely: Game Theory, and Economics. Game Theorists study rational strategic decision-making by examining mathematical models of games and how players interact.
In game theory a player is said to have perfect information when they possess “the same information to determine all of the possible games (all combinations of legal moves) as would be available at the end of the game.” Chess can be a game of perfect information since all the pieces are on the board throughout the game and all the rules are known ahead of time. Even then, though, most humans don’t possess the cognitive processing paths allowing them to treat chess as a game of perfect information. We’re simply not primed or trained to see all those possible moves from all sides.

A better game to think of in the context of perfect information is tic-tac-toe. Nine squares, two pieces (X’s and O’s), known rules, and much easier for us to process. Processing information (legal moves) in games is best described through using a decision tree (graphical tool where every option spawns a new branch of the tree) or a decision matrix (rows and columns of values that quantize relationships, such as those between choices in a game). The decision tree for tic-tac-toe is a lot simpler than that of chess, since the latter involves thirty-two game pieces and sixty-four squares (this is the main reason why it’s a lot easier to teach a computer how to play tic-tac-toe than chess).

In either game you’ve got perfect information if you can fill out the entire decision tree from start to finish. All the possible moves by all players.
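That claim is easy to check for tic-tac-toe, where the full decision tree is small enough to walk by brute force. A minimal sketch in Python, added here purely for illustration (not part of the original post):

```python
# Exhaustively walk the tic-tac-toe decision tree: every legal game,
# played out until someone wins or the board fills up.

LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
         (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
         (0, 4, 8), (2, 4, 6)]              # diagonals

def winner(board):
    """Return 'X' or 'O' if a line is complete, else None."""
    for a, b, c in LINES:
        if board[a] is not None and board[a] == board[b] == board[c]:
            return board[a]
    return None

def count_games(board=None, player='X'):
    """Count every distinct complete game (one leaf per move sequence)."""
    if board is None:
        board = [None] * 9
    if winner(board) or all(cell is not None for cell in board):
        return 1
    total = 0
    for i in range(9):
        if board[i] is None:
            board[i] = player
            total += count_games(board, 'O' if player == 'X' else 'X')
            board[i] = None
    return total

print(count_games())  # 255168 complete games -- small enough to enumerate
```

A quarter of a million leaves is trivial for a computer; chess, by contrast, has more possible games than atoms in the observable universe, which is exactly why no human treats it as a game of perfect information in practice.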

Let’s consider a new game to model. It’s a lot more complex. It’s called Being A Citizen.

Before saying “I have nothing to hide” I’d have to say that I possessed perfect information in the context of making that decision. That’s perfect information not only about every past move leading up to this decision but every future move after it. It assumes that all “pieces” are above the board and that I know all the rules to this game. And that’s demonstrably incorrect.

Let’s take the assets and programs of the National Security Agency as some of our game pieces. For them to be above the board we’d need the government to be both honest and accountable about them.
Instead, NSA Director Keith Alexander has repeatedly lied to the public about every aspect possible. So has Director of National Intelligence James Clapper. They’ve lied to us as individual players and Congress as what we might call a Superplayer; about buildings, assets, programs, collected materials. Everything we’d need to get a good idea, no less a complete idea, about the pieces on the playing board.

Having established that we don’t have clear information on the game pieces, let’s turn to the playing board. In order to play chess you’ve got to abide by certain rules, but there’s a trade-off: the rules are all made plain beforehand. You’re not going to get midway through the game and then be challenged about the legality of your opening move, either due to a rule that was hidden from you or due to a new interpretation of an old rule. But in the game model we’re dealing with here, government in general and intelligence agencies in particular have established exactly this possibility. As one example: the very court opinions and administration interpretations of the Patriot Act allowing the government to order telecommunications companies to collect and provide massive amounts of data on US citizens are secret.

The Foreign Intelligence Surveillance Court approved nearly nineteen thousand search/eavesdropping warrants from 1979 until 2004, while rejecting just four. And their proceedings are entirely sealed and secret from us. Unless, of course, leaking FISA information benefits the Government player. And then it suddenly appears. This, by the by, is what’s called information asymmetry. It takes place in asymmetric games, games in which strategies are not the same for each player but dictated by the power imbalance between players. Remember this concept, it’s important.

At this point we need to remember the structure of the NSA’s information-gathering programs. They’re largely not set up for distributed, real-time analysis of communications. They’re erected for investigative purposes, connecting the dots. Going back into records of previous events as far back as the records go. Which means that once you seemingly violate a rule that you’re not aware of, or once the administration alters its interpretation of the rule to make you a violator, they can now go back through every communication within their grasp and piece it together in any way they desire in order to make you appear guilty as sin.

Without you knowing, at any step of the process.

“But Ian,” you’re about to argue, “of course D-NSA Alexander and DNI Clapper lied to the public. FISA’s secret. They had to. It’s classified. Surely you didn’t expect them to expose their own secret programs?” No, I didn’t. I expect secrecy and confidential programs in government; I’d go so far as to say that secrecy is absolutely essential in some areas of government. Arguments about ending secrecy are naive from the outset. Abolishing secrecy isn’t the point.

The point is this: playing a game (read: making decisions) as if I have perfect information when I don’t manifests an inherently flawed strategy. This isn’t about what I expect of Alexander or Clapper, but what they expect from me in adopting “It’s okay because I have nothing to hide.” It presupposes that my interests and those of the government always lie in the same direction. That I know each strategy the government may take, every branch of their decision tree, that the government’s being straight with me, and that it has and will always have my individual interests at heart. Out of these three conditions, the first is ludicrous, the second is (again) immediately demonstrably false, and the third is false in nearly every lesson we’ve seen in history.

The interests of individual and government always have places of divergence, generally because government is full of other individuals all making strategic decisions in the interests of themselves and their ideologies. Our ability to compromise in places is what allows us to form governments. And compromise, while not inherently harmful, often involves finding common ground in the spaces between our original interests. Even more so when it’s done on a macro, societal scale with the potential to criminalize peaceful protests (like many Occupy sites), pass legislation that potentially criminalizes miscarriage, restrict a person’s right to be secure in their own biological functions, refuse equitable rights to people of a different sexual orientation, race or religion, or level extra scrutiny on the tax status of organizations of a particular political persuasion.

“I have nothing to hide” means you’re playing an asymmetric information game like other players would want you to: poorly. Out of some mythical principle you’ve chosen to tie both hands behind your back in order to play a game that the intelligence agencies won’t even tell you the rules to.
This is a game you will lose every time. Because not only do other players have more information than you, they also have just about all the power in the situation. And remember what I said above: strategy in asymmetric games is dictated by power imbalance between the players. Relinquishing both your power and your information is not a strategy, it’s suicide. A strategy is, say, aligning with other players cooperatively to combine your power, such as in protest. Or securing your own information, as in encrypting your data and anonymizing your internet usage.

We know what happens to protesters: they’re investigated, infiltrated, marginalized and criminalized. They face felony charges and thirteen years in prison for marking the sidewalk with water-soluble chalk (this last defendant was, thankfully, acquitted by a jury this week). And now leaked NSA guidelines reveal what happens to the other side of your strategy as well: using cryptographic and anonymizing technologies increases the chances that the NSA will not only scrutinize you further, but also keep your data in contravention of law.

In other words, when you pursue a rational strategy that harms no one, it’s used against you.

Just how do you think the NSA is approaching this game? To move this from game theory back into common terms: Just how do you think the NSA is approaching this decision-making process?

With your interests in mind?

So yes, I’m going to encrypt my data. I’m going to use Tor when I browse, I’m even going to order an Onion Pi and switch all my traffic over to Tor. I may be a very solid part of the surveillance state, being a police dispatcher for nearly a decade now. But I have something to hide: my communications, my traffic, my likes and dislikes, my entire online identity in some senses. I have something to hide not because I’m a bad person (I’m not) or because we live in a totalitarian state (we don’t) but because I don’t have perfect information and this game isn’t being played fairly.



2014-09-15 Privacy

The New South Wales police have used sophisticated hacking software to monitor the phones and computers of Australians, according to documents published by WikiLeaks.

In a new cache published on Monday NSW police are listed as a client of Gamma International, a German company that develops powerful spyware to remotely monitor computer use.
The documents show that NSW police have used several of the company’s spy programs for a number of investigations at a cost of more than $2m.

The software – known as FinSpy (or FinFisher) – allows widespread access to computer records, including extracting files from hard drives, grabbing images of computer screens, full Skype monitoring, logging keystrokes and monitoring email and chat communications.

“When FinSpy (FinFisher) is installed on a computer system it can be remotely controlled and accessed as soon as it is connected to the internet/network, no matter where in the world the target system is based,” earlier documentation published by WikiLeaks said.

In NSW the police can apply for a special type of covert search warrant that allows them to monitor computers remotely. The warrants are obtained from an “eligible judge” of the supreme court.

The computer access possible under the program is extensive. In one communication with the software developers, a NSW police officer writes that there are risks that sensitive information – such as privileged communication with a lawyer – could be caught by the program.

“Due to law restrictions on how certain information obtained … is it possible to implement a categorisation feature that can show categories for certain information? For instance. A key logger captures information which is between a lawyer and a known criminal which is not an offense in itself.

“The captured information needs to be able to be identified as legal privilege and not used in any further intelligence capability as it is considered private. There are other categories that may come up so it would be useful if the categories could be implemented at the user level rather than hard coded by Gamma.”

NSW Greens MLC David Shoebridge said the WikiLeaks revelations were deeply concerning.

“Information that should be privileged, including communications with a lawyer or information that’s well beyond the scope of the warrant, is almost certainly being captured by this warrant. It looks as if the police don’t have systems to exclude it, and it’s deeply troubling.”

He also said the documents highlighted the need for a public interest monitor in NSW to ensure there was sufficient scrutiny over the warrant process.

“There’s obviously a significant flaw in a system which has no public scrutiny of it and nobody there testing the case, with the evidence only put forward by the police. It’s not the role of the judge to test the evidence, so the hearings just have one side of the argument put forward.

“We should have a public interest monitor that appears in these proceedings to do just that – to be there as an independent third party testing the police evidence, and that’s an office we’ve been calling for [for] some time now.”

Despite the substantial costs associated with the program, there appear to be no online tender records of Gamma International or any of its subsidiaries holding contracts with the NSW police.

A spokesman for NSW police said: “Given this technology relates to operational capability, it’s not appropriate to comment.”





2013-11-09 Privacy

Surveillance expert says Edward Snowden is ‘a true hero and a patriot’


2011-08-02 Privacy

We’re expecting that a version of the Internet Blacklist Bill will be introduced in the US House of Representatives in coming weeks, so we’ve pulled together this video to remind the world about what makes it so awful.



2010-12-27 Privacy

Obtaining Online Privacy


Privoxy is a non-caching web proxy with advanced filtering capabilities for enhancing privacy: it modifies web page data and HTTP headers before the page is rendered by the browser, filtering pages and removing advertisements. This “privacy enhancing proxy” can be customized by users, for both stand-alone systems and multi-user networks.

GNU License

Privoxy is based on the Internet Junkbuster and is released under the GNU General Public License. It runs on GNU/Linux, OpenWRT, Windows, Mac OS X, OS/2, AmigaOS, BeOS, and most flavors of Unix. Almost any Web browser can use it. The software is hosted at SourceForge.

Tor & Squid

Privoxy is frequently used in combination with Tor and Squid and can be used to bypass Internet censorship.

See: Privoxy.Org


Tor is a system intended to enable online anonymity, composed of client software and a network of servers which can hide information about users’ locations and other factors which might identify them. Use of this system makes it more difficult to trace internet traffic to the user, including visits to Web sites, online posts, instant messages, and other communication forms. It is intended to protect users’ personal freedom, privacy, and ability to conduct confidential business, by keeping their internet activities from being monitored. The software is open-source and the network is free of charge to use.

Though the name Tor originated as an acronym of The Onion Routing project, the current project no longer considers the name to be an acronym, and therefore does not use capital letters.

Tor is an implementation of onion routing, and works by relaying communications through a network of systems run by volunteers in various locations. Because the internet address of the sender and the recipient are not both readable at any step along the way (and in intermediate links in the chain, neither piece of information is readable), someone engaging in network traffic analysis and surveillance at any point along the line cannot directly identify which end system is communicating with which other. Furthermore, the recipient knows only the address of the last intermediate machine, not the sender. By keeping some of the network entry points hidden, Tor is also able to evade many internet censorship systems, even ones specifically targeting Tor.
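The layering described above can be sketched with a toy model. This is only an illustration of the principle, not real onion routing: base64 stands in for each relay’s layer of actual encryption, and the hop names are made up.

```python
import base64
import json

def wrap(message, path):
    """Wrap `message` in one layer per relay on `path` (sender's side).

    In real onion routing each layer is encrypted to one relay's key;
    here base64 merely stands in for that encryption.
    """
    packet = message
    for relay in reversed(path):
        layer = json.dumps({"hop": relay, "payload": packet})
        packet = base64.b64encode(layer.encode()).decode()
    return packet

def peel(packet):
    """What a single relay sees: its own hop label and a still-wrapped payload.

    No relay sees both the original sender and the final message at once.
    """
    layer = json.loads(base64.b64decode(packet))
    return layer["hop"], layer["payload"]

onion = wrap("hello", ["guard", "middle", "exit"])
hop, onion = peel(onion)
print(hop)  # guard -- the first relay learns only its own layer
```

Peeling the remaining layers in turn yields "middle", then "exit", and only the exit layer contains the cleartext message, mirroring the description above: intermediate links can read neither endpoint’s address.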


An alpha version of the software, with the onion routing network “functional and deployed”, was announced on 20 September 2002. Roger Dingledine, Nick Mathewson, and Paul Syverson presented “Tor: The Second-Generation Onion Router” at the 13th USENIX Security Symposium on Friday, August 13, 2004.

Originally sponsored by the US Naval Research Laboratory, Tor was financially supported by the Electronic Frontier Foundation from 2004 to 2005. Tor software is now developed by the Tor Project, which since December 2006 has been a 501(c)(3) research/education non-profit organization based in the United States of America with a diverse base of financial support.


Tor aims to conceal its users’ identities and their network activity from surveillance and traffic analysis. Volunteers operate an overlay network of onion routers that provides anonymity in network location as well as anonymous hidden services. Tor employs encryption in a multi-layered manner (hence the original onion-routing analogy) and ensures perfect forward secrecy between routers.

Originating traffic

Users of a Tor network run an onion proxy on their machine. The Tor software periodically negotiates a virtual circuit through the Tor network, using multi-layer encryption, ensuring perfect forward secrecy. At the same time, the onion proxy software presents a SOCKS interface to its clients. SOCKS-aware applications may be pointed at Tor, which then multiplexes the traffic through a Tor virtual circuit.
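Concretely, a SOCKS-aware HTTP client only needs to be pointed at Tor’s local SOCKS port (conventionally 9050). A small sketch of the proxy settings in the shape expected by common Python HTTP clients; the helper name is my own, and actually sending traffic this way assumes a running Tor client plus SOCKS support in the HTTP library (e.g. the PySocks extra for requests):

```python
def tor_proxy_settings(host="127.0.0.1", port=9050):
    """Proxy settings for routing a SOCKS-aware HTTP client through Tor.

    The 'socks5h' scheme asks the proxy itself to resolve hostnames, so
    DNS lookups travel through the Tor circuit as well, instead of
    leaking to the local resolver (plain 'socks5' would leak them).
    """
    url = f"socks5h://{host}:{port}"
    return {"http": url, "https": url}

settings = tor_proxy_settings()
print(settings["https"])  # socks5h://127.0.0.1:9050
# e.g. requests.get("https://check.torproject.org/", proxies=settings)
# (needs a running Tor client and the PySocks extra; not executed here)
```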

Once inside a Tor network, the traffic is sent from router to router, ultimately reaching an exit node at which point the cleartext packet is available and is forwarded on to its original destination. Viewed from the destination, the traffic appears to originate at the Tor exit node.

Tor’s application independence sets it apart from most other anonymity networks: it works at the Transmission Control Protocol (TCP) stream level. Applications whose traffic is commonly anonymised using Tor include Internet Relay Chat (IRC), instant messaging and World Wide Web browsing. When browsing the Web, Tor is often coupled with Polipo or Privoxy proxy servers. (Privoxy is a filtering proxy server that aims to add privacy at the application layer.)

On older versions of Tor (an issue resolved May–July 2010), as with many anonymous web-surfing systems, applications usually performed direct Domain Name System (DNS) requests without using the Tor proxy. This allowed someone monitoring a user’s connection to determine (for example) which websites they were viewing through Tor, even though the content itself could not be seen. Using Privoxy, or the “torify” command included with a Tor distribution, is a possible solution to this problem. Additionally, applications using SOCKS5 – which supports name-based proxy requests – can route DNS requests through Tor, having lookups performed at the exit node and thus receiving the same anonymity as other Tor traffic.

More recent Tor releases include their own DNS resolver, which dispatches queries over the mix network. This closes the DNS leak and can interact with Tor’s address-mapping facilities to provide hidden-service (.onion) access to non-SOCKS-aware applications.

Hidden services

Tor can also provide anonymity to servers in the form of location-hidden services, which are Tor clients or relays running specially configured server software. Rather than revealing the server’s IP address (and therefore its network location), hidden services are accessed through the Tor-specific .onion pseudo-top-level domain (TLD), or pseudomain. The Tor network understands this TLD and routes data anonymously both to and from the hidden service. Due to this lack of reliance on a public address, hidden services may be hosted behind firewalls or network address translators (NAT). A Tor client is necessary in order to access a hidden service.
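The .onion pseudo-TLD has a recognizable shape: historically a hidden-service name was 16 base32 characters derived from the service’s public key (the later v3 format uses 56). A sketch of a purely syntactic check; the example hostname below is made up, not a real service:

```python
import re

# Base32 in onion names means a-z plus the digits 2-7.
# 16 characters = historic (v2) names; 16 + 40 = 56 = v3 names.
ONION_RE = re.compile(r"^[a-z2-7]{16}(?:[a-z2-7]{40})?\.onion$")

def looks_like_onion(hostname):
    """True if `hostname` is shaped like a .onion address (format only --
    this says nothing about whether the service actually exists)."""
    return bool(ONION_RE.match(hostname.lower()))

print(looks_like_onion("abcdefghij234567.onion"))  # True  (16-char, v2-style)
print(looks_like_onion("example.com"))             # False (resolved via public DNS)
```

A client would use such a check to decide whether a name can be resolved by public DNS at all or must be routed through the Tor network.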

Hidden services have been deployed on the Tor network beginning in 2004. Other than the database that stores the hidden-service descriptors, Tor is decentralized by design; there is no direct, readable list of all hidden services. Instead, a number of independently run hidden services act as directories.

Because location-hidden services do not use exit nodes, they are not subject to exit node eavesdropping. There are, however, a number of security issues involving Tor hidden services. For example, services that are reachable through Tor hidden services and the public Internet are susceptible to correlation attacks and thus not perfectly hidden. Other pitfalls include misconfigured services (e.g. identifying information included by default in web server error responses), uptime and downtime statistics, intersection attacks and user error.


Like all current low latency anonymity networks, Tor cannot and does not attempt to protect against monitoring of traffic at the boundaries of the Tor network, i.e., the traffic entering and exiting the network. While Tor does provide protection against traffic analysis, it cannot prevent traffic confirmation (also called end-to-end correlation).

Steven J. Murdoch and George Danezis from University of Cambridge presented an article at the 2005 IEEE Symposium on Security and Privacy on traffic-analysis techniques that allow adversaries with only a partial view of the network to infer which nodes are being used to relay the anonymous streams. These techniques greatly reduce the anonymity provided by Tor. Murdoch and Danezis have also shown that otherwise unrelated streams can be linked back to the same initiator. However, this attack fails to reveal the identity of the original user. Murdoch has been working with, and funded by, Tor since 2006.

In September 2007, Dan Egerstad, a Swedish security consultant, revealed that he had intercepted usernames and passwords for a large number of email accounts by operating and monitoring Tor exit nodes. As Tor does not, and by design cannot, encrypt the traffic between an exit node and the target server, any exit node is in a position to capture any traffic passing through it which does not use end-to-end encryption such as TLS. While this may not inherently violate the anonymity of the source, users who mistake Tor’s anonymity for end-to-end encryption are subject to additional risk of data interception by self-selected third parties. However, the operator of any network carrying unencrypted traffic, such as the operator of a wifi hotspot, has the same ability to intercept traffic as a Tor exit operator, so end-to-end encryption should always be used. Even without end-to-end encryption, Tor provides confidentiality against these local observers, who may be more interested in the traffic of users on their own network than an arbitrary Tor exit operator would be.

Nonetheless, Tor and the alternative network system JonDonym (JAP) are considered more resilient than alternatives such as VPNs. Were a local observer on an ISP or WLAN to attempt to analyze the size and timing of the encrypted data stream going through a VPN, Tor or JonDo system, the latter two would be harder to analyze, as demonstrated by a 2009 study.

Researchers from INRIA showed that the technique of concealing BitTorrent traffic inside Tor can be bypassed.


Because of its inherent anonymity, the traditional practices that network operators use to curb abuse may be insufficient for connections coming from the Tor network. Tor has some features intended to reduce this problem, both from the perspective of exit node operators and third-party sites.

Exit nodes each maintain an exit policy of what traffic is and is not permitted to leave the Tor network through that node. It is possible to prevent most major abuses of the Tor network using a combination of addresses and ports. Potential abuses include:

Bandwidth hogging

The onion routers are run by volunteers using their own bandwidth at their own cost, so Tor community members consider it impolite to transfer massive amounts of data across the network. In particular, because of the high bandwidth usage of peer-to-peer file-sharing networks, using Tor for protocols like BitTorrent is considered inappropriate. By default, the Tor exit policy blocks the commonly used peer-to-peer ports.


The default Tor exit policy prevents connections to port 25 (SMTP), preventing people from sending spam directly from the Tor network.

Anonymous users

The Tor project attempts to ensure that websites that wish to set different access policies for users visiting through Tor can do so, by providing various lists of Tor exit nodes. Wikipedia does this.
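An exit policy of this kind is just an ordered list of accept/reject rules, matched first-hit-wins. A simplified evaluator, loosely modeled on the torrc ExitPolicy syntax (real policies also support CIDR address masks and port ranges, omitted here; the addresses below are documentation examples):

```python
def exit_allows(policy, address, port):
    """Evaluate a simplified exit policy: first matching rule wins.

    Rules look like 'reject *:25' or 'accept *:*'. If no rule matches,
    we reject, mirroring the cautious default.
    """
    for rule in policy:
        action, _, pattern = rule.partition(" ")
        addr_pat, _, port_pat = pattern.partition(":")
        if addr_pat not in ("*", address):
            continue  # rule is for a different destination address
        if port_pat not in ("*", str(port)):
            continue  # rule is for a different destination port
        return action == "accept"
    return False  # nothing matched: reject

# A policy in the spirit of the defaults described above:
policy = ["reject *:25", "reject *:6881", "accept *:*"]
print(exit_allows(policy, "203.0.113.5", 25))   # False -- SMTP blocked (no spam)
print(exit_allows(policy, "203.0.113.5", 443))  # True  -- HTTPS allowed
```

Ordering matters: swapping "accept *:*" to the front would make the two reject rules unreachable, which is why real policies place their specific rejects before the final catch-all.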

Uses that may be illegal in some nations

In some countries, Tor is used to circumvent laws against criticism of heads of state, to access censored information, or to distribute copyrighted works and child pornography.


The main implementation of Tor is written in the C programming language and consists of roughly 146,000 lines of source code. Vuze (formerly Azureus), a BitTorrent client written in Java, includes built-in Tor support.

See: TorProject.Org



2010-11-01 Privacy

Internet Privacy

The right to privacy in Internet activity is a serious issue facing society.









In 2010, various world governments actively monitor all internet traffic. Under the broad umbrella of "searching for terrorism" or "national security", most are involved in maintaining huge databases of internet activity. For this reason, I do not think that it is possible to be "anonymous" on the net anymore.

However, some users of the 'net wish to shield their identities while participating in frank discussions of sensitive topics. Others fulfill fantasies and harmlessly role-play under the cover of a false identity in chatrooms, MUDs or IRC. But there are the eternal "bad apples," and on the Internet, they are the people who use anonymous servers as more than a way to avoid responsibility for controversial remarks.

Cases of harassment and abuse have become increasingly frequent, aided by a cloak of anonymity.

There are also problems with frauds and scam artists who elude law enforcement authorities through anonymous mailings and postings. Other users are concerned about the proliferation of information on the Internet. Databases of court records are now available for free over the World Wide Web.

Since no formal law exists within cyberspace, Internet users can find recourse only through the applicable laws of their own government. This web page will not attempt to discuss conflict of law in the international arena, but will address US law and its relevance to the American Internet user.

Constitutional Basis for Privacy

Nowhere does the text of the United States Constitution contain the word "privacy." The US Supreme Court has found the concept of "privacy" to be protected by a number of the Amendments. Thus, privacy is known as a "penumbra right." It is the essence of the Bill of Rights and thus a guaranteed right.

Invasion of Privacy

This page will discuss five aspects of invasion of privacy as it applies to American netizens. These are: search and seizure, unsolicited e-mail, defamation, secrecy and the creation of databases consisting of personal information.

The US Supreme Court has stated that American citizens have the protection of the Fourth Amendment (freedom from search and seizure absent warrant) when there is a reasonable expectation of privacy. Without a reasonable expectation of privacy, however, there is no privacy right to protect. Files stored on disk or tape in the home are protected, but the rule becomes less clear when applied to files stored on an Internet access provider's server. Web servers, on the other hand, may be protected by federal law. Some argue, however, that consent of the access provider is all that is required for law enforcement authorities to search and seize any files in the possession of that access provider. Internet service providers may have a lot of information about their users, because servers routinely record information about users' e-mail and web browsing habits.

Unsolicited email is not regulated by federal law at present. Various states have outlawed unsolicited commercial email, however. A federal judge in Philadelphia has ruled that companies have no First Amendment right to send unsolicited messages to online service subscribers. One community remedy preferred by many users is to complain to the offender's service provider and publicly denounce the action on the Internet Blacklist. Another possibility for those with Eudora for the PC is to filter out unsolicited email. Other free software claims to eliminate unwanted email. Individuals have sued spammers in court and won.

Individual states specifically prohibit defamation, no matter what form it takes. Defamation consists of false and unprivileged publication which results in economic damages. Financial loss is not necessary where the statement implies that a person is a criminal or has an unpleasant disease, or where it injures a person in respect to his or her office, profession, or business. A judge in Texas issued an injunction to stop defamatory posting by an Internet user. Hate messages sent by e-mail have also resulted in criminal penalties.

In some cases, companies have asked the court system to identify the authors of anonymous defamatory messages. This has been done by filing “John Doe” lawsuits and issuing subpoenas to Yahoo! and other message boards where individuals have posted the disparaging messages. Most such subpoenas go unchallenged. One CompuServe user has complained that his service provider turned over account information, including credit card numbers, without any notice to him. One such plaintiff, Raytheon Corporation, dropped its suit as soon as it discovered the names of the people posting anonymous messages.

Trade secrets and other confidential information can also pose legal problems. Unauthorized entry into a computer system is illegal, whether the target machine is connected to the Internet or not. Nevertheless, hackers still manage to get past the most difficult of firewalls. Compromise of company secrets can lead to millions of dollars in damages. Hacking is not the only danger to sensitive information, however. Some software can tell webmasters which visitors came from which links. In addition, all e-mail has an address attached. Even if the message content is encrypted, system administrators have access to the fact of communication between two parties.
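That last point about metadata can be made concrete: even when the body of a message is encrypted, the envelope headers that mail servers need for routing stay readable. A minimal sketch using Python's standard `email` library — the addresses are made up and the ciphertext body is only a placeholder, not real PGP output:

```python
from email.message import EmailMessage

# Build a message whose body is opaque ciphertext. The body here is a
# placeholder; genuine PGP output would be equally unreadable to a relay.
msg = EmailMessage()
msg["From"] = "alice@example.org"
msg["To"] = "bob@example.org"
msg["Subject"] = "(encrypted)"
msg.set_content("-----BEGIN PGP MESSAGE-----\n...\n-----END PGP MESSAGE-----")

# Any relay or system administrator handling this message still sees
# who is talking to whom, and when -- only the content is hidden.
print(msg["From"], "->", msg["To"])
```

Encrypting the body protects the words, not the fact of the correspondence — which is exactly the exposure the paragraph above describes.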

The existence of communication can itself often be secret, and the Internet cannot provide absolute security. In many ways, the Internet abhors secrecy. Many netizens believe in an absolute free-flow of all information.

A few companies are creating huge databases full of private information. Websites may even collect email addresses inadvertently. In many cases, there is no prohibition on the dissemination of personal information. The federal government regulates only its own databases, leaving private database owners to decide how and when to distribute collected information.

Webmasters have begun using "cookies" as a means of accumulating information about web surfers without having to ask for it. Cookies attempt to keep track of visitors to a Web site. Criticism of cookies has included fear of the loss of privacy. The information that cookies collect from users may be profitable both in the aggregate and by the individual. Whether the convenience that cookies provide outweighs the loss of privacy is a question each Internet user must decide for him or herself.
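Mechanically, a cookie is just a small name/value pair that the server asks the browser to store and send back on later requests — which is what lets a site recognise a returning visitor without asking. A minimal sketch using Python's standard `http.cookies` module; the cookie name and value are illustrative:

```python
from http.cookies import SimpleCookie

# Server side: issue a tracking identifier to a first-time visitor.
outgoing = SimpleCookie()
outgoing["visitor_id"] = "abc123"      # illustrative value
outgoing["visitor_id"]["path"] = "/"
header = outgoing.output(header="Set-Cookie:")
print(header)  # Set-Cookie: visitor_id=abc123; Path=/

# Browser side: the same pair comes back on every later request,
# letting the server link those requests into one browsing history.
incoming = SimpleCookie("visitor_id=abc123")
print(incoming["visitor_id"].value)
```

The privacy question in the paragraph above comes from that linking step: each individual request reveals little, but the identifier ties them together over time.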

America Online has been accused of selling database information about its users. This has led to an effort on the part of cookie proponents to control the amount of information that cookies collect. Not all proponents of cookies adhere to the voluntary standards. The Federal Trade Commission determined that Geocities, a once-popular web site where users entered personal information, was selling information in apparent violation of its own privacy policy.


Internet users sometimes do not realize the amount of privacy that is lost when accessing the online world. Usenet postings and contributions to bulletin boards may remain archived forever. Public records are available free or for a fee. While much of this information has been freely available in the past, the advent of the Internet has made it available more easily and quickly.


Generally, there is no safety in activity that is unlawful. Anonymity should not be considered a secure shield because, ultimately, any data may be intercepted. An offender is specifically liable for damages to the victim. At times, this liability can be extended to include others (e.g. an employer) with a greater capacity for satisfying the judgment.

In addition, there is current disagreement about whether a system operator or an access provider should be held liable.

The Telecommunications Act of 1996 protects access providers from liability for the unlawful acts of users (except copyright infringement or obscenity). Some Courts have protected ISPs and online services from vicarious liability for the acts of subscribers/customers.

On the other hand, vicarious liability, even without knowledge, may result when either the web page designer or the ISP:

  • has the right and ability to control the infringer’s acts; and

  • receives a direct financial benefit from the infringement.


The law surrounding issues of jurisdiction over individuals and businesses with an online presence continues to evolve. The first such case was U.S. v. Thomas. Courts in Minnesota, Arizona and Ohio have made similar determinations.

One California case is Hall v. LaRonde, in which the California Court of Appeal held that creating a contract over email could warrant the exercise of jurisdiction.

Some argue that the nature of the Internet could result in forum shopping and the necessity of modifying traditional notions of venue requirements. Other courts have declined to exercise jurisdiction on the simple basis of accessibility of the Web page.

So the question of whether any individual court will decide to find jurisdiction on any specific set of facts is an issue for the judge hearing the case. The US Supreme Court will almost certainly lay down some guidelines within the next few years.


Avoiding the seizure of communication in transit is less a legal problem than a technological one. There is software that can provide privacy protection for the individual Internet user, and hardware that can prevent very sophisticated industrial espionage. Of most concern to the common Internet user is the protection of email. The most famous encryption software is PGP, created by Phil Zimmermann. The U.S. government has attempted to suppress other means of strong encryption. These attempts have not always been successful, however.

Anonymous discourse is still available through the use of "remailers." Various problems with a well-known anonymous remailing service in Finland led to the discontinuation of that service. Other remailers continue to provide anonymity to e-mail and Usenet participants.

The key to anonymous publishing on the World Wide Web is to have as many web servers as possible within the control and/or trust of the publisher. It also helps to have a number of other servers performing a service to the community. Ian Goldberg and David Wagner at Berkeley are attempting to provide that service.

Recent versions of Netscape and Internet Explorer give web surfers the option of refusing to accept cookies. Windows users may automatically delete all cookies with every restart of the machine.
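That kind of periodic cleanup is easy to script. A hedged sketch in Python — the cookie directory and the `*.txt` naming below are stand-ins for wherever a particular browser keeps its cookie files, which varies by browser and version:

```python
import glob
import os
import tempfile

def purge_cookies(cookie_dir):
    """Delete every cookie file in cookie_dir; return how many were removed."""
    removed = 0
    for path in glob.glob(os.path.join(cookie_dir, "*.txt")):
        os.remove(path)
        removed += 1
    return removed

# Demonstration against a throwaway directory of fake cookie files.
demo = tempfile.mkdtemp()
for name in ("site-a.txt", "site-b.txt"):
    open(os.path.join(demo, name), "w").close()
print(purge_cookies(demo))  # 2
```

Run at startup (or shutdown), a script like this gives the "fresh every restart" behaviour the paragraph above describes, at the cost of losing logins and preferences along with the tracking identifiers.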


Internet Surveillance

Computer surveillance is the act of performing surveillance of computer activity, and of data stored on a hard drive or being transferred over the Internet.

Computer surveillance programs are widespread today, and almost all internet traffic is closely monitored for signs of illegal activity.

Supporters say that watching all internet traffic is important, because by knowing everything that everyone is reading and writing, they can identify terrorists and criminals, and protect society from them.

Critics cite concerns over privacy and the possibility of a totalitarian state where political dissent is impossible and opponents of state policy are removed in COINTELPRO-like purges. Such a state may be referred to as an Electronic Police State, in which the government aggressively uses electronic technologies to record, organize, search and distribute forensic evidence against its citizens.


The right against unsanctioned invasion of privacy by the government, corporations or individuals is part of many countries’ privacy laws, and in some cases, constitutions. Almost all countries have laws which in some way limit privacy; an example would be laws concerning taxation, which normally require the sharing of information about personal income or earnings.

In some countries individual privacy may conflict with freedom of speech laws and some laws may require public disclosure of information which would be considered private in other countries and cultures. Privacy may be voluntarily sacrificed, normally in exchange for perceived benefits and very often with specific dangers and losses, although this is a very strategic view of human relationships.

Academics who are economists, evolutionary theorists, and research psychologists describe revealing privacy as a ‘voluntary sacrifice’, for instance where sweepstakes or competitions are involved. In the business world, a person may give personal details (often for advertising purposes) in order to enter a draw to win a prize. Information which is voluntarily shared and is later stolen or misused can lead to identity theft.

The concept of privacy is most often associated with Western culture, English and North American in particular. According to some researchers, the concept of privacy sets Anglo-American culture apart even from other Western European cultures such as French or Italian. The concept is not universal and remained virtually unknown in some cultures until recent times.


Before attempting anonymity on the Internet, it is best to think for a moment about your purpose and intent in hiding your identity. Think as well about the impact of the statement you wish to communicate. The safest course may be to acknowledge authorship. This avoids the uncertainties of liability and discovery.



2010-07-21 Privacy


Use of closed circuit television cameras by government agencies (police, hospitals, schools, rail and road authorities) and private bodies (retailers, taxi operators, private security services) continues to increase. In the UK it has been claimed that the average citizen is captured by 300 cameras each day, that there were around 1.5 million cameras in 400 communities as of 2002 (with 40,000 operated by local government, up from 100 in 1990), and that distinctions between private and public space are eroding.

Most developed countries, Australia included, are witnessing increased government and public concerns about crime and security. Amid these anxieties, closed circuit television (CCTV) systems to monitor public spaces are increasingly being touted as a solution to problems of crime and disorder. The city of Perth established Australia’s first open street closed circuit television system in July 1991. Subsequently, there has been significant expansion.

At the end of 2002 Australia had 33 “open street” CCTV schemes. Based on site inspections, extensive reviews of documentation and interviews with 22 Australian administrators, this article discusses issues relating to system implementation, management and accountability. We also suggest ways relevant authorities might ensure that current and future schemes are appropriately audited and evaluated. We argue that rigorous independent assessment of both the intended and unintended consequences of open street CCTV is essential to ensure this measure is not deployed inappropriately. Finally, this article suggests any potential crime prevention benefits must be carefully weighed against the potential of CCTV to exacerbate social division and exclusion.

There is no consensus on the effectiveness of public CCTV as a deterrent or an effective mechanism for responding to crime, although there are strong suggestions that the technological fix is overrated and oversold. Organisations responding to the “need to be doing something” are susceptible to spending money on equipment acquisition and deployment without appropriate investment in ongoing monitoring.

In practice the value of CCTV is often forensic - a tool for identifying what happened - rather than preventive. This is unsurprising, as some images are not closely monitored (“no one is actually watching what’s seen by the eye in the sky”), image quality is poor or devices are not working, and help is not readily at hand if the observer does identify an incident.

A Shoalhaven (NSW) resident has started a petition calling on the council to remove CCTV cameras from Nowra’s CBD. The council is currently installing 18 day and night cameras, but Adam Bonner says the cameras are an invasion of privacy. “I think there’s a wide opportunity there for abuse,” he said. In response Mayor Paul Green says people who are doing the right thing should not fear the cameras.

Canadian Privacy Commissioner: “If we cannot walk or drive down the street without being systematically monitored by the cameras of the state, our lives and our society will be irretrievably altered. The psychological impact of having to live in a sense of constantly being observed must surely be enormous, indeed incalculable. We will have to adapt, and adapt we undoubtedly will. But something profoundly precious - our right to feel anonymous and private as we go about our day-to-day lives - will have been lost forever.”

Several commentators (e.g., Bannister, Fyfe, & Kearns, 1998; Dees, 2000; Fyfe & Bannister, 1996) link the rise of CCTV to the tendency for urban centres to be transformed from sites of production to sites of consumerism and consumption. They argue that populations are being divided into competent and “flawed” consumers--the latter lacking resources to participate in a consumerist economy.

The globalising of commerce has also led to a commodification of individual town and city centres. Such centres are increasingly image conscious, and CCTV has played an important part in marketing public areas to tourists, other consumers and investors as “risk-free” (McCahill, 2002, p. 12). Intertwined with the reshaping of urban images has been the rise of an exclusionary impulse - a desire to rid public spaces of flawed consumers. As McCahill suggests, this impulse is directed at “the visibility of unemployed or homeless people on the streets or hanging around in shopping centres.”

To some, it’s comforting to imagine vigilant police monitoring every camera, but the truth is very different. Most CCTV footage is never looked at until well after a crime is committed. When it is examined, it’s very common for the viewers not to identify suspects. Lighting is bad and images are grainy, and criminals tend not to stare helpfully at the lens. Cameras break far too often. The best camera systems can still be thwarted by sunglasses or hats. Even when they afford quick identification — think of the 2005 London transport bombers and the 9/11 terrorists — police are often able to identify suspects without the cameras. Cameras afford a false sense of security, encouraging laziness when we need police to be vigilant.

The solution isn’t for police to watch the cameras. Unlike an officer walking the street, cameras only look in particular directions at particular locations. Criminals know this, and can easily adapt by moving their crimes to someplace not watched by a camera — and there will always be such places. Additionally, while a police officer on the street can respond to a crime in progress, the same officer in front of a CCTV screen can only dispatch another officer to arrive much later. By their very nature, cameras result in underused and misallocated police resources.



