
How to check individual cookies in IE, Firefox and Chrome

How often has this happened to you: you visit a site, and afterwards, whichever site you go to, you keep seeing ads for the site you visited?

If you know anything about the web and browsers, you know it is the cookies planted by that site. Cookies are nothing but small text files containing some information about the client machine. They are often harmless, and mostly used by sites to give you a tailor-made experience the next time you visit, but they can be very annoying in cases like the one I just mentioned.

Browsers give you the facility to clear either the entire browsing history (which includes visited sites, cookies, cache, form data etc.) or individual items, and you can delete all the cookies in one go. But if you want to see which sites have planted cookies on your machine and clear only the ones you want to, read on.

Chrome: 

Click on the 'Wrench Menu' and choose 'Settings'.
Click on 'Show advanced settings'.
Under 'Privacy', click on 'Content Settings' button.
In the 'Content Settings' window that opens, under the 'Cookies' header you can see all the options related to cookies. Click on the 'All cookies and site data' button to see all the sites that have planted cookies. You can even inspect the individual cookies planted by each site.
You can delete any or all of the cookies in the list.
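If you prefer the keyboard, you can also clear a cookie from the developer console: assigning a cookie with an expiry date in the past tells the browser to remove it. A small sketch of the idea; `buildExpiredCookie` is my own helper name, and the actual deletion (the `document.cookie` assignment in the comment) only works inside a browser:

```javascript
// Build the "name=; expires=<past date>" string that, when assigned
// to document.cookie in a browser, deletes the named cookie.
function buildExpiredCookie(name, path = '/') {
  const past = new Date(0).toUTCString(); // 01 Jan 1970, i.e. already expired
  return `${encodeURIComponent(name)}=; expires=${past}; path=${path}`;
}

console.log(buildExpiredCookie('ad_tracker'));
// In a browser console you would run:
//   document.cookie = buildExpiredCookie('ad_tracker');
```

Note that the `path` (and, for some cookies, the domain) must match the cookie being deleted, otherwise the browser treats the assignment as setting a different cookie.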



Firefox:

Go to Tools > Options > Privacy tab.
In the 'History' section, you can see the links 'clear your recent history' and 'remove individual cookies'. The latter opens a window listing all the cookies planted by all the sites. Once again, you can delete any or all of the cookies in the list.


IE 9:

The cookies for IE are stored in this folder: C:\Users\&lt;username&gt;\AppData\Roaming\Microsoft\Windows\Cookies\Low. You can delete cookies either by clearing the contents of this folder, or by pressing Ctrl+Shift+Del and selecting cookies in the list of items to be deleted.

However, there is no simple way to delete individual cookies here, though you can delete the cookies for a particular domain. Open the site in the browser and press F12 to open the 'Developer Tools'. Click on 'Cache' in the menu to see the various options related to cookie handling. There you can view the cookies planted by this site and delete them.



So there you have it. That's the way the cookie crumbles!


