Tuesday, February 25, 2014

Featured Paper: Bridging the Cellular Divide: A Search for Consensus Regarding Law Enforcement Access to Historical Cell Data

From the February 2014 Issue of the Cardozo Law Review:

Zachary Ross, Bridging the Cellular Divide: A Search for Consensus Regarding Law Enforcement Access to Historical Cell Data.

Excerpt:
Technological change is often a double-edged sword--it enables and enriches our lives, but also allows for new means of exploitation and control. As social, architectural, and market barriers protecting longstanding notions of personal space erode, individuals increasingly rely on the legal system as a defense to arbitrary invasions of privacy. Paradoxically, the same forces that make the need for robust privacy protections more compelling also make the existing legal framework outdated and inapposite. 
These contradictions are readily apparent in the contemporary debate over the legal restrictions on government access to cell site location information (CSLI). This data, constantly collected by cell phone service providers (CSPs) in order to manage their networks, has the potential to provide a detailed map of an individual cell user's movements from place to place over extended periods of time. Furthermore, the quantity and precision of location data collected by CSPs is constantly increasing, becoming more revealing, and more valuable to law enforcement in the process. Despite the potential intimacy of this data and its growing relevance to criminal investigations, the legal protection afforded CSLI is hotly disputed, and at present varies greatly among (and sometimes even within) jurisdictions-- with courts sometimes requiring a warrant, and sometimes allowing unfettered access upon a lesser evidentiary showing. This lack of uniformity has been exacerbated by a recent Fifth Circuit ruling on government access to CSLI, which generated a different rule than had previously been adopted by the Third Circuit. The vastly disparate treatment of government requests for CSLI has created a chaotic system ripe for abuse, and all but guaranteed Supreme Court review of the issue in the near future, as the Court itself seems to have implicitly acknowledged. 
This Note will examine the complex interaction between privacy, surveillance, and technology through an exploration of the contested legal terrain governing law enforcement access to historical CSLI--location data recorded by CSPs which reveal an individual's past movements. 

Monday, February 10, 2014

Personal Data Protection and Breach Accountability Act of 2014 would enact criminal penalties for "intentionally or willfully" concealing a security breach

Thanks in part to the recent security breaches at Target and Neiman Marcus, pressure for a federal response to data security has been mounting. Numerous bills have been introduced in the House and the Senate calling for new legislation to address the data security problem.

A recurring feature of many of these bills is a new criminal statute aimed at individuals who intentionally or willfully conceal a known security breach. I recently introduced readers to Senator Patrick Leahy’s Personal Data Privacy and Security Act of 2014 and detailed some of the bill's criminal proposals, including numerous amendments to the Computer Fraud and Abuse Act. The bill also included a proposed criminal statute that would read,
Whoever, having knowledge of a security breach and of the fact that notice of such security breach is required under title II of the Personal Data Privacy and Security Act of 2014, intentionally and willfully conceals the fact of such security breach, shall, in the event that such security breach results in economic harm to any individual in the amount of $1,000 or more, be fined under this tile [sic] or imprisoned for not more than 5 years, or both. 
Last Tuesday, ahead of a Senate Judiciary Committee hearing addressing the Target and Neiman Marcus data breaches, Senator Richard Blumenthal and Senator Ed Markey introduced the Personal Data Protection and Breach Accountability Act of 2014. According to a recent press release, Senator Blumenthal stated that the bill “will give consumers much stronger, industry-wide protections against massive thefts of private financial information” and that “[s]tiffer enforcement with stringent penalties are vital to assure that retailers use state of the art safeguards.” Similar to Senator Leahy’s bill, the Personal Data Protection and Breach Accountability Act of 2014 would include a new criminal statute that would read, 
Whoever, having knowledge of a security breach and of the fact that notice of such security breach is required under title II of the Personal Data Protection and Breach Accountability Act of 2014, intentionally or willfully conceals the fact of such security breach and which breach, shall, in the event that such security breach results in economic harm or substantial emotional distress to 1 or more persons, shall be fined under this title or imprisoned not more than 5 years, or both.
A notable difference between these two proposals is the Personal Data Protection and Breach Accountability Act’s requirement that the breach “results in economic harm or substantial emotional distress to 1 or more persons.” In my eyes, this would encompass significantly more security breaches than Senator Leahy's already broad proposal.

In a recent op-ed for the International Association of Privacy Professionals’ online publication, Privacy Perspectives, I question whether criminal liability for failing to disclose a data security breach would be a prudent move, focusing specifically on Senator Leahy’s bill. My concerns would extend to this new proposal as well.

It will be interesting to see, with such an outcry for a federal response, what (if anything) will be adopted, and whether some variation of these "criminal concealment of a known security breach" proposals will be included.

Thursday, February 6, 2014

Quick note: Oklahoma Appellate Court: No reasonable expectation of privacy in text messages sent to another person's phone

The case is State v. Marcum, No. S-2012-976 (OK App. Ct. Jan. 28, 2014). FourthAmendment.com's summary of the case (with slight modifications):
A search warrant was used to obtain [cell phone text records]. [The court held] that [t]here is no reasonable expectation of privacy in the records of another person’s account [even if those records pertain to you]. When you hit “send” and [transmit a] text message to another person, you’ve lost any reasonable expectation of privacy in the message.
The court's holding can be boiled down to the following paragraph:
Addressing only the narrow question before us, Marcum has not demonstrated a reasonable expectation of privacy in the records seized from U.S. Cellular for Miller's phone account. This Court adopts the reasoning of the courts which have concluded that there is no expectation of privacy in the text messages or account records of another person, where the defendant has no possessory interest in the cell phone in question, and particularly where, as here, the actual warrant is directed to a third party. 
The court frames it as a novel issue, but I think the third-party doctrine compels a rather straightforward outcome.

Wis. Sup. Ct. 4th Amendment case: if a probation condition bans possessing a PC (i.e. contraband), you can seize it, but can you search it?

The Wisconsin Supreme Court heard oral arguments today in State v. Purtell, 2012AP001307-CR (Wis. Sup. Ct. 2014) (link to PDF of docket). A summary from the Wisconsin Public Defender's "On Point" site gives a good synopsis:
Purtell was on probation for animal cruelty convictions, and as a condition of probation was allowed access to computers only for school or work. After Purtell admitted having a laptop at home, his agent went to his home and removed the laptop. She found files showing females, some appearing to be very young, engaged in sexual acts with animals; after a warrant to search the computer was obtained based on that information, police found child pornography. The sole issue on appeal was whether the agent had reasonable suspicion to search Purtell’s computer for “contraband,” which the state argued included images of animal cruelty. The court of appeals held there was no reasonable suspicion, first because Purtell’s conditions of probation didn’t expressly prohibit him from possessing such images, and, second, because the state pointed to no reasonable grounds to believe there was some other kind of contraband on the laptop, but relied only on “generally suspicious” behavior. 
The oral argument can be found in the Wisconsin Supreme Court's oral argument archive, or click here for a direct link to the streaming WMA file.

The Supreme Court's summary of the case can be found in its February oral argument preview. The State (Petitioner) frames the issue in this manner:
The content of Purtell’s computer, like the computer itself, was contraband regardless of whether Purtell’s probation included a rule or condition prohibiting the possession of images depicting cruelty to animals.
Purtell (Respondent) frames it like this:
The Images Retrieved from Mr. Purtell’s Computer Were Inadmissible Because the Probation Agent Did Not Have Reasonable Grounds to Believe the Computer Contained Contraband. 
... 
Courts must separately analyze the reasonableness of a search for a computer and a search of the contents of a computer.
The State's Reply Brief can be found here.

The appellate court, which reversed the trial court and remanded, gives the following background synopsis:
Purtell was convicted of two counts of cruelty to animals, one resulting in the death of the animal, and he was placed on probation.  One condition of Purtell’s probation was that he not own or possess a computer and that he could only use a computer “at his place of business or school.”  The purpose of this prohibition may have been to limit Purtell’s access to certain types of images, but the conditions of his probation did not actually impose a limitation on the types of images or written materials Purtell could possess. 
At a meeting with his probation agent, Purtell complained about the no-computer condition.  Purtell told the agent that he had a working laptop and a desktop computer that did not work, both at his residence.  Purtell also told the agent that he had a Myspace account and gave the agent his Myspace password. 
For reasons that do not matter for purposes of this appeal, Purtell’s agent subsequently went to Purtell’s residence and removed his laptop and desktop computers.  The seizure of Purtell’s computers is not challenged.  Later, at her office, the agent looked at the contents of one of Purtell’s computers. The agent “clicked on files” and observed that titles of the files did not always match the images that were in the files.  The agent located files showing females engaged in sexual acts with animals.  The agent later testified:  “[A] number of the files, when we opened them, had names of like very young females.  [And there was] concern at some point that this was sex involving underage females.” 
Based on information that Purtell’s agent gained from looking at the contents of Purtell’s computers, law enforcement subsequently obtained a warrant to search the computers. The resulting further search revealed a large volume of still images and “videos” depicting young children engaged in sex acts.   
Purtell was charged with eight counts of possession of child pornography.  He moved to suppress the evidence resulting from the search of his computers, arguing that his probation agent performed an illegal warrantless search.  At a hearing on this suppression motion, Purtell’s probation agent testified that, prior to searching the contents of one of Purtell’s computers at her office, she looked at Purtell’s Myspace account.  On that account, she saw pictures of “animals that were partially human,” such as a “woman that was half woman and half a cow.”  The agent testified that, based on what she saw on Purtell’s Myspace account, she thought Purtell’s computers might have “files regarding cruelty to animals or death and mutilation of animals.”  She was concerned about Purtell’s mental health issues. 
After hearing testimony and viewing evidence, the circuit court denied Purtell’s suppression motion.  The court concluded that the agent had “legitimate reasons of probation supervision to view the [contents of the] computers.”  The court stated that the images the agent saw on Purtell’s Myspace account gave the agent reason to believe that there was contraband on Purtell’s computers.
The substance of the appellate court's decision:
As Purtell makes clear, he does not challenge the search of his residence or the seizure of his computers.  Rather, he challenges the search of the contents of his computers.  Indeed, the State and Purtell agree that the issue here is whether Purtell’s probation agent had “reasonable grounds” to believe that Purtell’s computers contained “contraband.”  The parties further agree that “contraband,” for purposes of this case, means any item that Purtell was not allowed to possess under the conditions of his supervision or any item whose possession is forbidden by law. 
So far as we can tell, the State’s sole argument on appeal is that, based on several pieces of information, Purtell’s probation agent had reasonable grounds to believe that Purtell’s computers contained images depicting cruelty to animals or the mutilation of animals, and that such images were “contraband.” However, even if we were persuaded that there were reasonable grounds to believe that Purtell’s computers contained images depicting cruelty to animals or the mutilation of animals, the State fails to demonstrate that such images are “contraband.”

…before this court and the circuit court, the State simply pointed to behavior that was generally suspicious, such as the fact that Purtell possessed the computers at home in violation of the conditions of his probation and Purtell’s failure to attend a scheduled mental health treatment appointment.  These and other factors may have justified the probation agent taking some action, but they do not supply “reasonable grounds” to believe that Purtell’s computers contained contraband.  As we have explained, the State’s argument in this regard appears to be based on the faulty assumption that Purtell’s probation conditions prohibited him from possessing images depicting cruelty to animals or the mutilation of animals.  Having rejected that assumption, the State’s arguments leave us with no basis to affirm the circuit court’s denial of Purtell’s suppression motion.  

Tuesday, February 4, 2014

Massive round-up of new law articles, covering privacy, Fourth Amendment, GPS, cell site, cybercrime, big data, revenge porn, drones, and more

This Article examines a question that has become increasingly important in the emerging surveillance society: Should the law treat information as private even though others know about it? This is the third-party privacy problem. Part II explores two competing conceptions of privacy — the binary and contextual conceptions. Part III describes two features of the emerging surveillance society that should change the way we address the third-party privacy problem. One feature, “surveillance on demand,” results from exponential increases in data collection and aggregation. The other feature, “uploaded lives,” reflects a revolution in the type and amount of information that we share digitally. Part IV argues that the binary conception cannot protect privacy in the surveillance society because it fails to account for the new realities of surveillance on demand and uploaded lives. Finally, Part V illustrates how courts and legislators can implement the contextual conception to deal with two emerging surveillance society problems — facial recognition technology and geolocation data.

Privacy laws rely on the unexamined assumption that the collection of data is not speech. That assumption is incorrect. Privacy scholars, recognizing an imminent clash between this long-held assumption and First Amendment protections of information, argue that data is different from the sort of speech the Constitution intended to protect. But they fail to articulate a meaningful distinction between data and other more traditional forms of expression. Meanwhile, First Amendment scholars have not paid sufficient attention to new technologies that automatically capture data. These technologies reopen challenging questions about what “speech” is. 
This Article makes two overdue contributions to the First Amendment literature. First, it argues that when the scope of First Amendment coverage is ambiguous, courts should analyze the government’s motive for regulating. Second, it highlights and strengthens the strands of First Amendment theory that protect the right to create knowledge. Whenever the state regulates in order to interfere with the creation of knowledge, that regulation should draw First Amendment scrutiny. 
In combination, these claims show clearly why data must receive First Amendment protection. When the collection or distribution of data troubles lawmakers, it does so because data has the potential to inform and to inspire new opinions. Data privacy laws regulate minds, not technology. Thus, for all practical purposes, and in every context relevant to privacy debates, data is speech.
The police tend to think that those who evade surveillance are criminals. Yet the evasion may only be a protest against the surveillance itself. Faced with the growing surveillance capacities of the government, some people object. They buy “burners” (prepaid phones) or “freedom phones” from Asia that have had all tracking devices removed, or they hide their smartphones in ad hoc Faraday cages that block their signals. They use Tor to surf the internet. They identify tracking devices with GPS detectors. They avoid credit cards and choose cash, prepaid debit cards, or bitcoins. They burn their garbage. At the extreme end, some “live off the grid” and cut off all contact with the modern world. 
These are all examples of what I call privacy protests: actions individuals take to block or to thwart government surveillance for reasons unrelated to criminal wrongdoing. Those engaged in privacy protests do so primarily because they object to the presence of perceived or potential government surveillance in their lives. How do we tell the difference between privacy protests and criminal evasions, and why does it matter? Surprisingly scant attention has been given to these questions, in part because Fourth Amendment law makes little distinction between ordinary criminal evasions and privacy protests. This Article discusses the importance of these ordinary acts of resistance, their place in constitutional criminal procedure, and their potential social value in the struggle over the meaning of privacy.
Conor M. Reardon, Cell Phones, Police Recording, and the Intersection of the First and Fourth Amendments, 63 Duke Law Journal 735-779 (2013). Abstract:
In a recent spate of highly publicized incidents, citizens have used cell phones equipped with video cameras to record violent arrests. Oftentimes they post their recordings on the Internet for public examination. As the courts have recognized, this behavior lies close to the heart of the First Amendment. 
But the Constitution imperfectly protects this new form of government monitoring. Fourth Amendment doctrine generally permits the warrantless seizure of cell phones used to record violent arrests, on the theory that the recording contains evidence of a crime. The Fourth Amendment inquiry does not evaluate a seizing officer’s state of mind, permitting an official to seize a video for the very purpose of suppressing its contents. Moreover, Supreme Court precedent is typically read to ignore First Amendment interests implicated by searches and seizures. 
This result is perverse. Courts evaluating these seizures should stop to recall the Fourth Amendment’s origins as a procedural safeguard for expressive interests. They should remember, too, the Supreme Court’s jurisprudence surrounding seizures of obscene materials—an area in which the Court carefully shaped Fourth Amendment doctrine to protect First Amendment values. Otherwise reasonable seizures can become unreasonable when they threaten free expression, and seizures of cell phones used to record violent arrests are of that stripe. Courts should therefore disallow this breed of seizure, trusting the political branches to craft a substitute procedure that will protect law-enforcement interests without doing violence to First Amendment freedoms.
Elizabeth Friedler, Protecting the Innocent—the Need to Adapt Federal Asset Forfeiture Laws to Protect the Interests of Third Parties in Digital Asset Seizures, Cardozo Arts & Entertainment Law Journal, Volume 32, Issue 1 (2013).

Jana Sutton, Of Information, Trust, and Ice Cream: A Recipe for a Different Perspective on the Privacy of Health Information, 55 Ariz. L. Rev. 1171 (2014). Abstract:
The concept of privacy is inescapable in modern society. As technology develops rapidly and online connections become an integral part of our daily routines, the lines between what may or may not be acceptable continue to blur. Individual autonomy is important. We cannot, however, allow it to suffocate the advancement of technology in such vital areas as public health. Although this Note cannot lay out detailed instructions to balance the desire for autonomy and the benefits of free information, it attempts to provide some perspective on whether we are anywhere close to striking the right balance. When the benefits of health information technology are so glaring, and yet its progress has been so stifled, perhaps we have placed far too much value—at least in the health care context—on individual privacy.
Kevin S. Bankston & Ashkan Soltani, Tiny Constables and the Cost of Surveillance: Making Cents Out of United States v. Jones, 123 YALE L.J. ONLINE 335 (2014). Abstract:
In United States v. Jones, five Supreme Court Justices wrote that government surveillance of one’s public movements for twenty-eight days using a GPS device violated a reasonable expectation of privacy and constituted a Fourth Amendment search. Unfortunately, they didn’t provide a clear and administrable rule that could be applied in other government surveillance cases. In this Essay, Kevin Bankston and Ashkan Soltani draw together threads from the Jones concurrences and existing legal scholarship and combine them with data about the costs of different location tracking techniques to articulate a cost-based conception of the expectation of privacy that both supports and is supported by the concurring opinions in Jones.
Schmitt, Michael N. and Vihul, Liis, The International Law of Attribution During Proxy 'Wars' in Cyberspace (January 30, 2014). 1 Fletcher Security Review (2014 Forthcoming). Abstract:
The article examines the use of non-State actors by States to conduct cyber operations against other States. In doing so, it examines attribution of a non-State actor's cyber operations to a State pursuant to the law of State responsibility, attribution of a non-State actor's cyber armed attack to a State for the purposes of a self-defense analysis, and attribution of cyber military operations to a State in the context of determining whether an international armed conflict has been initiated. These three very different legal inquiries are often confused with each other. The article seeks to deconstruct the issue of attribution into its various normative components.
Kate Crawford & Jason Schultz, Big Data and Due Process: Toward a Framework to Redress Predictive Privacy Harms, 55 B.C. L. Rev. 93 (2014). Abstract:
The rise of “Big Data” analytics in the private sector poses new challenges for privacy advocates. Through its reliance on existing data and predictive analysis to create detailed individual profiles, Big Data has exploded the scope of personally identifiable information (“PII”). It has also effectively marginalized regulatory schema by evading current privacy protections with its novel methodology. Furthermore, poor execution of Big Data methodology may create additional harms by rendering inaccurate profiles that nonetheless impact an individual’s life and livelihood. To respond to Big Data’s evolving practices, this Article examines several existing privacy regimes and explains why these approaches inadequately address current Big Data challenges. This Article then proposes a new approach to mitigating predictive privacy harms—that of a right to procedural data due process. Although current privacy regimes offer limited nominal due process-like mechanisms, a more rigorous framework is needed to address their shortcomings. By examining due process’s role in the Anglo-American legal system and building on previous scholarship about due process for public administrative computer systems, this Article argues that individuals affected by Big Data should have similar rights to those in the legal system with respect to how their personal data is used in such adjudications. Using these principles, this Article analogizes a system of regulation that would provide such rights against private Big Data actors.
Larkin, Paul J., 'Revenge Porn,' State Law, and Free Speech (January 14, 2014).  Abstract:
For most of our history, only celebrities — presidents, movie stars, professional athletes, and the like — were at risk of having their everyday exploits and activities photographed and shown to the world. But that day is gone. Today, we all face the risk of being made into a celebrity due to the ubiquity of camera-equipped cell phones and the ease of uploading photographs or videos onto the Internet. But a particularly troubling aspect of this phenomenon goes by the name of "revenge porn" — that is, the Internet posting of photographs of naked former wives and girlfriends, sometimes in intimate positions or activities. Revenge porn is an example of malicious conduct that injures the welfare of someone who mistakenly trusted an intimate partner. Tort law traditionally has allowed parties to recover damages for such violations of privacy, and criminal law also can prohibit such conduct, but there are several First Amendment defenses that the responsible parties can assert to fend off liability. This article argues that allowing a victim of revenge porn to recover damages for publication that breaches an implicit promise of confidentiality is faithful to tort and criminal law principles and will not punish or chill the legitimate expression of free speech.
Jonathan Olivito, Beyond the Fourth Amendment: Limiting Drone Surveillance Through the Constitutional Right to Informational Privacy, 74 Ohio St. L.J. 669 (2013). 

The entirety of Volume 74, Issue 6 of the Ohio State Law Journal; Symposium: The Second Wave of Global Privacy Protection (Titles Below)
Peter Swire, The Second Wave of Global Privacy Protection: Symposium Introduction, 74 Ohio St. L.J. 841 (2013). 
Ann Bartow, Privacy Laws and Privacy Levers: Online Surveillance Versus Economic Development in the People’s Republic of China, 74 Ohio St. L.J. 853 (2013). 
Andrew Clearwater & J. Trevor Hughes, In the Beginning . . . An Early History of the Privacy Profession, 74 Ohio St. L.J. 897 (2013). 
Claudia Diaz, Omer Tene & Seda Gürses, Hero or Villain: The Data Controller in Privacy Law and Technologies, 74 Ohio St. L.J. 923 (2013). 
A. Michael Froomkin, “PETs Must Be on a Leash”: How U.S. Law (and Industry Practice) Often Undermines and Even Forbids Valuable Privacy Enhancing Technology, 74 Ohio St. L.J. 965 (2013). 
Woodrow Hartzog, Social Data, 74 Ohio St. L.J. 995 (2013). 
Dennis D. Hirsch, In Search of the Holy Grail: Achieving Global Privacy Rules Through Sector-Based Codes of Conduct, 74 Ohio St. L.J. 1029 (2013). 
Gus Hosein & Caroline Wilson Palow, Modern Safeguards for Modern Surveillance: An Analysis of Innovations in Communications Surveillance Techniques, 74 Ohio St. L.J. 1071 (2013). 
Anil Kalhan, Immigration Policing and Federalism Through the Lens of Technology, Surveillance, and Privacy, 74 Ohio St. L.J. 1105 (2013). 
Bartosz M. Marcinkowski, Privacy Paradox(es): In Search of a Transatlantic Data Protection Standard, 74 Ohio St. L.J. 1167 (2013). 
Thomas Margoni & Mark Perry, Deep Pockets, Packets, and Harbors, 74 Ohio St. L.J. 1195 (2013). 
Omer Tene, Privacy Law’s Midlife Crisis: A Critical Assessment of the Second Wave of Global Privacy Laws, 74 Ohio St. L.J. 1217 (2013). 
Yofi Tirosh & Michael Birnhack, Naked in Front of the Machine: Does Airport Scanning Violate Privacy? 74 Ohio St. L.J. 1263 (2013). 
Yang Wang, Pedro Giovanni Leon, Xiaoxuan Chen, Saranga Komanduri, Gregory Norcie, Kevin Scott, Alessandro Acquisti, Lorrie Faith Cranor & Norman Sadeh, From Facebook Regrets to Facebook Privacy Nudges, 74 Ohio St. L.J. 1307 (2013). 
Tal Z. Zarsky & Norberto Nuno Gomes de Andrade, Regulating Electronic Identity Intermediaries: The “Soft eID” Conundrum, 74 Ohio St. L.J. 1335 (2013).
The entirety of Volume 14, Issue 1 of the Journal of High Technology Law (2014) (Titles Below).
After Jones, The Deluge: The Fourth Amendment's Treatment Of Information, Big Data And The Cloud, Lon A. Berk, 14 J. High Tech L. 1 (2014). 
The Legislative Response To Employers' Requests For Password Disclosure, Jordan M. Blanke, 14 J. High Tech L. 42 (2014). 
A Shot In The Dark: An Analysis Of The SEC's Response To The Rise Of Dark Pools, Edwin Batista, 14 J. High Tech L. 83 (2014). 
Privacy Protections Left Wanting: Looking At Doctrine And Safeguards On Law Enforcements' Use Of GPS Tracking And Cell Phone Records With A Focus On Massachusetts, Lloyd Chebaclo, 14 J. High Tech L. 120 (2014).

Monday, February 3, 2014

11th Cir. upholds pre-Jones warrantless GPS under good faith exception; precedent was 1981 beeper case

Another circuit court (the 11th) has jumped on the good faith exception bandwagon and upheld pre-Jones warrantless GPS use, finding that law enforcement reasonably relied on "binding" precedent at the time the GPS tracker was installed. The case is United States v. Ransfer, __ F.3d __ (11th Cir. 2014).

--Note: While Ransfer was pending, the 11th Cir. decided United States v. Smith, __ F.3d __ (11th Cir. 2013), which involved Ransfer's co-defendants. The court upheld warrantless GPS under the good faith exception in that case, as well, with a much more detailed explanation.

The "binding precedent" the Ransfer court cites to justify warrantless GPS tracking is United States v. Michael, 645 F.2d 252 (5th Cir. 1981) (en banc) (when the 11th Cir. was created in 1981, it incorporated 5th Cir. precedent). The court also relied on United States v. Andres, 703 F.3d 828 (5th Cir. 2013), a similar warrantless GPS case holding that police reliance on Michael was reasonable (and thus the good faith exception applied). As noted by the Ransfer court:
The Fifth Circuit recently held police could rely on Michael “[d]espite any possible technological differences between a 1981 ‘beeper’ and the GPS device used in this case, [because] the functionality is sufficiently similar that the agents’ reliance on Michael to install a GPS device on the truck, in light of the reasonable suspicion of drug trafficking, was objectively reasonable.” United States v. Andres, 703 F.3d 828, 835 (5th Cir. 2013) cert. denied, 133 S. Ct. 2814 (2013). We agree with the Fifth Circuit that Michael was clear, binding precedent that holds the electronic tracking of a vehicle without a warrant does not violate the Fourth Amendment, particularly where officers had reasonable suspicion the vehicle was involved in criminal activity.
The 11th Circuit distinguished Katzin -- the recent 3rd Cir. case rejecting a good faith exception argument (see my post: Third Circuit: Warrant required for GPS tracking (Katzin); answers what Sup. Ct. reserved in Jones) -- by pointing to the police's limited use of the GPS tracker. Namely, "the GPS tracker was not used to trace the movements of Defendants. The tracking device was not used until after an armed robbery was committed and the vehicle was used to flee the scene. Then the GPS tracking device was used for a very brief period of time after the robbery to pinpoint the location of the vehicle and to dispatch police to arrest Defendants..." As I see it, the court is stating that the way GPS tracking was used here was more analogous to tracking via beeper than extended electronic surveillance; therefore, the court notes:
the technological distinctions the Third Circuit found relevant in Katzin do not apply to the facts of this case: 'Unlike GPS trackers, beepers require that the police expend resources – time and manpower – to physically follow a target vehicle.' Katzin, 2013 WL 5716367 at *6. That is exactly what occurred in this case.
Of course, I am not surprised at the outcome, given that most other circuits have held similarly. However, I think the court's attempt to distinguish Katzin is clumsy and logically questionable. While I agree that the GPS tracker was used very minimally here, it still permitted the police to pinpoint the car without expending resources (i.e. having to follow the car); that is quite different from Knotts or Michael, where police had to be in range of the beeper's radio signal and thus had to surveil to some extent. See Katzin ("GPS technology must be distinguished from the more primitive tracking devices of yesteryear such as 'beepers.' Beepers are nothing more than 'radio transmitter[s], usually battery operated, which emit[] periodic signals that can be picked up by a radio receiver.' United States v. Knotts, 460 U.S. 276, 277, 103 S. Ct. 1081, 75 L. Ed. 2d 55 (1983). In contrast to GPS trackers, beepers do not independently ascertain their location — they only broadcast a signal that the police can then follow via a corresponding receiver. Moreover, beeper signals are range-limited: if the police move far enough away from the beeper, they will be unable to receive the signal that the unit broadcasts. At bottom, then, beepers are mere aids for police officers already performing surveillance of a target vehicle.")

More fundamentally, though, I reiterate my distaste for outcomes like this that shift the Supreme Court's Davis/Leon opinions from requiring good faith to something more akin to blind faith. Many of these cases give a whiff of backward reasoning coupled with deference to police; to me, the constitutional protections of the Fourth Amendment should take precedence.

For example, to buy the good faith argument here, you have to accept the following:

1. That police were aware of the Michael precedent from 1981 at the time of the GPS installation.
2. That police knew 5th Circuit precedent was binding because it was incorporated by the 11th in 1981.
3. That "Michael articulated clear, binding precedent that installation of a device permitting electronic surveillance of a vehicle does not violate the Fourth Amendment." Ransfer.
4. That Michael referred to a beeper as an electronic tracking device and it is commonly understood that "a GPS device is an 'electronic tracking device'"; thus, arguing a difference in kind between beepers and GPS trackers makes "too fine a distinction." Smith, __ F.3d __ (11th Cir. 2013) (slip op. at 20).
5. That the Michael beeper and Ransfer GPS are technologically analogous, as well, notwithstanding that "the precise technological capabilities of the beeper were not explained in the [Michael] opinion." Andres, slip op. at 10.
6. That when police installed the GPS tracker without a warrant they "'followed the Eleventh Circuit’s . . . precedent to the letter.'" Smith.
Side Note: With regard to the actual capabilities of the beeper in Michael, I went back to the panel decision, and it seems that the beeper was merely used to aid in visual surveillance of the van (i.e. to allow police to stay farther away) and not to locate it without any surveillance effort at all. See United States v. Michael, 622 F.2d 744 (5th Cir. 1980) ("The warehouse was located four days after installation of the beeper through following Michael's van with [the beeper's] aid."). It seems odd, then, for the court in Smith to assert that an argument that beepers and GPS trackers are not functionally similar draws "too fine a distinction." The law has an open texture, to be sure, but distinctions must be made and analogies must have limits when technology is involved -- the much maligned "tiny constable" in Jones epitomizes this point.
With that in mind, I don't know if I'm willing to believe all six arguments/premises above and conclude that the police in Ransfer were (to use the language from Davis v. United States, 131 S. Ct. 2419 (2011)) "specifically authorized" by "unequivocal" precedent (as opposed to "interpret[ing] ambiguous precedent," a situation where the good faith exception does not apply) to place a GPS tracker on the defendant's car.

My main point is that the good faith exception is worthwhile when there is actually clear, binding precedent. Once you remove clarity (as is the case here), you begin making assumptions about individual knowledge, intent, logical extrapolation, analytical thinking, and various other mental processes. The end result is judicial deference to law enforcement at the expense of subverting constitutional protections.

This is especially true in cases where the good faith exception has been applied despite the absence of any appellate precedent authorizing the police activity in question. See, e.g., OH App Ct: Warrantless GPS tracking OK despite no precedent; My take on the "good" left in the good faith exception.