In today’s world, I doubt I can expect anyone to get the reference in my title, so I’m going to start by ruining the punchline right here: it’s a play on the idea of a Dickensian world, which is defined in the Cambridge Dictionary (and who should know better?) as
relating to or similar to something described in the books of the 19th-century English writer Charles Dickens, especially living or working conditions that are below an acceptable standard:
The bathrooms in this hotel are positively Dickensian – no hot water and grime everywhere.
In an old piece of news, dating back to ancient times—2013, to be exact—Annie Dookhan, described as “an ambitious state chemist,” set out to “prove her worth.” She was enabled in her quest by a set of working conditions that were below an acceptable standard.
Yeah, I know: not the world’s best joke. But at least you know I care enough to try.
Anyway, Annie loved her job, you see. But the job where Annie got her fun involved working as a chemist in the drug analysis unit that tested drug evidence submitted by law enforcement throughout the state of Massachusetts. Over an unknown period of time—she was hired in 2003, promoted in 2005, and resigned in 2012—Dookhan’s “work product was consistently the highest in the lab among her co-workers.”
There was just one problem. It appears that Dookhan’s ability to generate the “highest” work product depended upon her ability to skip the work part. And if someone started to dig a little too deeply into what she was—or, rather, wasn’t—doing, she covered her tracks with a bit of photoshopping. In other words, she created fake work product that looked enough like the real thing to pass in court. One example in the Statement of the Case went as follows:
[I]nvestigators found a discovery packet that had been emailed to a prosecutor for a pending criminal case that contained an altered test. In that packet, Dookhan submitted a print out for a test designed to quantify the drug sample. In organizing the discovery information, the defendant realized that she had not printed out, or never ran, the quantifying analysis. To cover this mistake, the defendant ran the test using that the [sic] case sample number and submitted it with the discovery packet. The defendant obliterated the date the test was run. This particular machine has no capacity to save past analyses and the print date on the bottom of the document states May 5, 2011, nearly six months after the drug samples were returned to the submitting police agency.
I used the term “photoshopping” above to describe what Dookhan did. Adobe wants to remind people that, well:
Trademarks are not verbs.
Correct: The image was enhanced using Adobe® Photoshop® software.
Incorrect: The image was photoshopped.
But, strictly speaking, that ship has sailed. The genie is out of the bottle. Words happen. And Collins Dictionary online defines “photoshop” without the registered trademark sign, thusly:
verb -shops, -shopping or -shopped
(transitive) trademark[;] to alter (a digital photograph or other image), using an image editing application, esp Adobe Photoshop
Jet ski through the tears, there’s no way you’ll ever scotch tape that broken rule back together. Xerox it all you want. Fax it to violators. “Photoshop” is something we do as much as a software product we use.
Photographs have been susceptible to photoshopping since before there was a synonym for “altering” that could be applied specifically to the photo-pictorial medium. Two images from the linked article bear insertion here.
In fact, we can go back even farther:
Photoshop has just made it one heckuva lot easier to create fake photos to tell fake stories.
Photoshop itself has allegedly played a part in helping to solve crime. Sometimes indirectly. And in at least one case I found from 2009, a Tennessee man was arrested for his Photoshop activities and charged with “aggravated sexual exploitation of a minor” for photoshopping pictures of underage girls onto adult female bodies. This is despite the fact that the United States Supreme Court ruled in 2002 that “virtual child pornography,” in which no actual children were harmed, was protected speech that did not constitute a crime. ((Lexis indicates that the Beatty case was superseded by statute as noted in United States v. Beatty, 2009 U.S. Dist. LEXIS 121473, but that does not appear to be true.)), ((Japan has recently taken a different view on that.))
Other than that, though, a somewhat-exhaustive search I’ve done of the Internet, and of case law via Lexis, reveals no instances where photoshopped images have altered the outcome of a criminal case. There have been some problematic cases, such as the Connecticut case of State v. Swinton, where bite mark evidence was overlaid using Photoshop to show a “match.” That case almost certainly has other, more problematic issues (e.g., the fake science of “matching” bite marks) than the Photoshop one. In any event, while there was a complaint that the method for “enhancing” the photos falsely increased the appearance of a match, there was no accusation that I recall of any deliberate use of Photoshop to falsify the evidence.
In short, I didn’t really find any cases of deliberate falsification of evidence using Photoshop—though given our modern “professional” police force, I would not be shocked to learn one exists.
Which is a good thing, because even as some viewers have become more sophisticated about spotting fake photography, the ability to create realistic-looking fakes has grown apace. Moreover, the world really is a strange place, and sometimes things are just as they appear in photographs of them, however unbelievable that may be.
But this does not put my concerns to rest.
Using Photoshop to manipulate images in an undetectable manner still requires at least some amount of skill to pull off. Additionally, I’m just not aware of that many cases where a photograph, by itself, was enough to make, or break, the case. It’s entirely possible that the reason we haven’t seen this problem in criminal cases is that the combination of the difficulty in pulling it off, plus the probably minimal payoff it would bring, just makes it not worth the effort.
Previewing the app at the Adobe Max 2016 software expo last week, researcher Zeyu Jin from Princeton University showed just how easy it will be in the near future to manipulate and transform sound files – and in extreme cases effectively put words that were never actually said into people’s mouths.
I’m not sure why the article says, “and in extreme cases.” I saw the demo. If by “extreme,” they mean “with a little work,” that’s not it at all.
By simply copying and pasting in the text window – with no other editing techniques needed at all – Jin … changes the recording….
That’s right. Unlike with Photoshop, it does not appear that you need to have any special knowledge of how to use the program. No special tricks. No special filters. No add-ons, or plug-ins.
Adobe hasn’t explained how this technology works just yet, but the software seems to identify and log phonemes – the individual speech sounds we put together to make up words and sentences.
With the right amount of sound data on file – which Adobe says is about 20 minutes of one person talking – VoCo will have actually recorded enough of these phonemes to basically impersonate that person, by stitching them together into new word and sentence formations.
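The quoted description can be sketched as a toy. Assuming—and this is my assumption, not Adobe’s published design—that VoCo keeps a bank of phoneme snippets harvested from roughly twenty minutes of a speaker’s recordings, “putting words in someone’s mouth” amounts to little more than concatenation. Every name and sample value below is invented for illustration:

```python
# Toy illustration (NOT Adobe's actual method): a "voice bank" maps
# phonemes to recorded snippets of the speaker's voice. Audio is faked
# here as short lists of sample values.

voice_bank = {
    # hypothetical phoneme -> samples harvested from real recordings
    "HH": [0.1, 0.2],
    "EH": [0.3, 0.1],
    "L":  [0.0, -0.1],
    "OW": [-0.2, 0.2],
}

def synthesize(phonemes, bank):
    """Stitch stored phoneme snippets into one new utterance."""
    samples = []
    for p in phonemes:
        if p not in bank:
            raise KeyError(f"speaker never recorded phoneme {p!r}")
        samples.extend(bank[p])
    return samples

# "Hello" built from sounds the speaker actually made --
# just never in this order.
fake_utterance = synthesize(["HH", "EH", "L", "OW"], voice_bank)
```

A real system would also have to smooth the joins between snippets so they sound natural, which is presumably where Adobe’s research effort went—but the core trick, reassembling recorded sounds into unsaid sentences, is as simple as it looks here.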
My initial reaction to seeing this last night was horror. While everyone else was talking about how neat this was, how cool this was, my only thought was that “false confessions” just took on a whole new meaning, and I wondered about the potential for their increasing use to obtain convictions.
You might trust the police. I don’t. I’ve been a criminal defense lawyer long enough to recognize that police do lie. The FBI recognizes the potential enough to suggest documentation techniques when it comes to chain of custody, and other factors, relating to digital imagery. The possibility of faking computer-dependent evidence has been raised even in DNA cases. For that matter, the FBI itself has finally admitted to having essentially created an entire fake field of forensic science that resulted in virtually every “examiner” in that field giving “flawed” testimony in “almost all trials in which they offered evidence against criminal defendants over more than a two-decade period before 2000.”
The fact is, police officers lie. When they’re being honest, they admit it. Judges know it. One astonishing study made clear that even prosecutors know:
[This] study is stunning because, unlike many of the comments on this issue, Orfield’s findings are based on the views of prosecutors and judges as well as those of defense attorneys. In his survey of these three groups (which together comprised twenty-seven to forty-one individuals, depending on the question), 52% believed that at least “half of the time” the prosecutor “knows or has reason to know” that police fabricate evidence at suppression hearings, and 93%, including 89% of the prosecutors, stated that prosecutors had such knowledge of perjury “at least some of the time.” Sixty-one percent, including 50% of the state’s attorneys, believed that prosecutors know or have reason to know that police fabricate evidence in case reports, and 50% of the prosecutors believed the same with respect to warrants (despite the fact that many prosecutors refused to talk about this latter area). While close to half of all respondents believed that prosecutors “discourage” such perjury and fabrication, a greater percentage believed that they “tolerate” it, and 15% believed that prosecutors actually “encourage” it. One former prosecutor described what he called a “commonly used” technique of steering police testimony by telling officers “[i]f this happens, we win. If this happens, we lose.” Most amazingly, 29% of the respondents did not equate lying at a suppression hearing with the crime of perjury.
But even if you trust police, how do you feel about people going through nasty divorces? Again, I’ve lost count of the number of men I’ve defended against false accusations of domestic violence, especially during a breakup or divorce.
And, guess what? Not only is it difficult to win such cases in some areas—after all, our culture has devolved to the point where “innocent unless proven guilty” has become “why would victims lie?”—but women who are able to get restraining orders, or protective orders, are expressly given permission to record any interactions with the restrained person. By “expressly,” I mean the law allows it, and judges tell women that.
What are the chances some semi-intelligent, wholly vindictive woman is going to provide law enforcement with a recording showing a criminal threat was made against her when, in the real world, no such threat was made?
Adobe says it is aware of the potential for misuse with Project VoCo, so is already working on technologies that will make it possible to detect if a recording has been tampered with – such as embedding hidden audio watermarks, which could potentially trigger voice security features used in systems like digital banking.
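For what it’s worth, here is roughly what a watermark-based tamper check could look like in miniature. Adobe has not described its actual scheme, so everything below—the bit pattern, the least-significant-bit embedding—is a hypothetical sketch of the general idea, not their technology:

```python
# Toy sketch of audio watermarking (purely hypothetical scheme):
# hide a known bit pattern in the least-significant bit of each
# 16-bit audio sample. If an editor splices in or regenerates
# samples, the hidden pattern breaks and the tampering is detectable.

WATERMARK = [1, 0, 1, 1, 0, 1, 0, 0]  # assumed secret pattern

def embed(samples, mark=WATERMARK):
    """Overwrite each sample's low bit with the repeating watermark bit."""
    return [(s & ~1) | mark[i % len(mark)] for i, s in enumerate(samples)]

def is_intact(samples, mark=WATERMARK):
    """True only if every sample still carries its expected hidden bit."""
    return all(s & 1 == mark[i % len(mark)] for i, s in enumerate(samples))

original = embed([1000, 1001, 1002, 1003, 1004, 1005, 1006, 1007])

# Splice two foreign samples into the middle, as an editor might.
tampered = original[:4] + [2047, 2048] + original[6:]
```

Of course, a check like this only helps if the court, or the opposing party, knows to run it—and a forger with access to the watermarking scheme could simply re-embed the mark after editing. Which brings me back to my point: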
Swell. I guess there’s nothing to worry about then.