compare legs

@stackedcrates / stackedcrates.tumblr.com

biochemistry / dogs / justice is just us
pseudomantis

Domesticated computers will eat a disc right out of someone’s hand but wild computers are too shy you have to leave the disc on the ground and let it walk over to it and eat it itself

how dare you leave this important pc health info in the tags

reggiemess

I love it when dogs try to help but the task at hand requires zero dogs so they just kinda stand in front of you and look serious.

the whole point of girls doing the not like other girls thing is basically them believing that the girls around them lack an interior life. they were raised on depictions of women as shallow caricatures and they recognize themselves to be actual people with thoughts and feelings, but instead of drawing the conclusion that these depictions of women are incorrect, they draw the conclusion that those depictions of women are correct and that must mean they're smarter and have more substance than the "average" woman. to exit the not like other girls phase means recognizing that all real life women have the same level of interiority as you do, and that depictions of women in the media as consistently empty headed and frivolous are misogynistic stereotypes not grounded in reality.

i say all this to say that to exit a not like other girls phase does not necessitate taking an interest in "traditionally feminine" ways of dress and makeup, nor does it mean never saying anything negative about the beauty industry

hey y'all... i know i never come on here anymore, but i could really use some help right now. the pandemic ruined my mental health, which led to me losing my job at the end of june, and i've only just now gotten a new one. the pay is GREAT and i'm gonna be fine going forward but things are kinda dire right now. i'm gonna be about $300 short on rent, and there's a backlog of power bills that i need to start paying a couple hundred of now or i risk shutoff. if anyone could spare a little or reblog this, i'd be crazy grateful. for those who don't know me: im transgender, gay, jewish, and i struggle w various mental health conditions including bpd. i'm a hard worker and try my best to make the people around me feel safe and seen at all times. i just don't always take the best care of myself, which leads to situations like this one. i have a degree in English and can provide proofreading or writing services, if you understandably don't like the idea of just throwing money at someone. venmo: @adrianc95 paypal: https://paypal.me/pools/c/8BxZ3Idoll

peitalo

obsessed with saying “both. love wins” when asked to make a decision between two things…it is not functional and nothing is solved but u know what? love won. what else matters truly

How copyright filters lead to wage-theft

Last week, “Marina” - a piano teacher who publishes free lessons on her Piano Keys Youtube channel - celebrated her fifth anniversary by announcing that she was quitting Youtube because her meager wages were being stolen by fraudsters.

Marina posted a video with a snatch of her performance of Beethoven’s “Moonlight Sonata,” composed in 1801. The composition is firmly in the public domain, and the copyright in the performance is firmly Marina’s, but it still triggered Youtube’s automated copyright filter.

A corporate entity - identified only by an alphabet soup of initialisms and cryptic LLC names - had claimed Ole Ludwig Van’s masterpiece as their own, identifying it as “Wicca Moonlight.”

Content ID, the automated Youtube filter, flagged Marina’s track as an unauthorized performance of this “Wicca Moonlight” track. Marina appealed the automated judgment, which triggered a message to this shadowy LLC asking if they agreed that no infringement had taken place.

But the LLC renewed its claim of infringement. Marina now faces several unpleasant choices:

  1. She can allow the LLC to monetize her video, stealing the meager wages she receives from the ads that appear on it
  2. She can take down her video
  3. She can provide her full name and address to Youtube in order to escalate the claim, with the possibility that her attackers will get her contact details, and with the risk that, if she loses her claim, she could lose her Youtube channel
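
To make that flow concrete, here’s a minimal sketch of the claim-and-dispute process as a state machine. The state names and transitions are my own simplification, not Youtube’s actual terminology or API, and the real system has many more branches:

```python
from enum import Enum, auto

class ClaimState(Enum):
    CLAIMED = auto()      # filter match: the claimant monetizes the video
    DISPUTED = auto()     # the uploader appeals the automated judgment
    RELEASED = auto()     # the claimant drops the claim; revenue returns to the uploader
    REINSTATED = auto()   # the claimant rejects the dispute (Marina's situation)
    ESCALATED = auto()    # the uploader files a formal counter-claim: name, address, legal risk

# Simplified transition table; any unrecognized action leaves the state unchanged.
TRANSITIONS = {
    (ClaimState.CLAIMED, "uploader_disputes"): ClaimState.DISPUTED,
    (ClaimState.DISPUTED, "claimant_releases"): ClaimState.RELEASED,
    (ClaimState.DISPUTED, "claimant_reinstates"): ClaimState.REINSTATED,
    (ClaimState.REINSTATED, "uploader_escalates"): ClaimState.ESCALATED,
}

def next_state(state: ClaimState, action: str) -> ClaimState:
    return TRANSITIONS.get((state, action), state)

# Marina's path: the filter claims her video, she disputes, the LLC reinstates.
state = ClaimState.CLAIMED
for action in ("uploader_disputes", "claimant_reinstates"):
    state = next_state(state, action)
print(state)  # ClaimState.REINSTATED - leaving her the three options above
```

Note that every transition after the first depends on the claimant bothering to respond; the uploader’s only way out of a reinstated claim is the escalation step, with everything that entails.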

The incident was a wake-up call for Marina, who is quitting Youtube altogether, noting that it has become a place that favors grifters over creators. She’s not wrong, and it’s worth looking at how that happened.

Content ID was created to mollify the entertainment industry after Google acquired Youtube. Google would spend $100m on filtering tech that would allow rightsholders to go beyond the simple “takedown” permitted by law, and instead share in revenues from creative uses.

But it’s easy to see how this system could be abused. What if people falsely asserted copyright over works to which they had no claim? What if rightsholders rejected fair uses, especially criticism?

In a world where the ownership of creative works can take years to untangle in the courts and where judges’ fair use rulings are impossible to predict in advance, how could Google hope to get it right, especially at the vast scale of Youtube?

The impossibility of automating copyright judgments didn’t stop Google from trying to perfect its filter, adding layers of complexity until Content ID’s appeal process turned into a cod-legal system whose flowchart looks like a bowl of spaghetti.

The resulting mess firmly favors attackers (wage stealers, fraudsters, censors, bullies) over defenders (creators, critics). Attackers don’t need to waste their time making art, which leaves them with the surplus capacity to master the counterintuitive “legal” framework.

You can’t fix a system broken by complexity by adding more complexity to it. Attempts to do so only make the system more exploitable by bad actors, like blackmailers who use fake copyright claims to extract ransoms from working creators.

But it would be a mistake to think that filterfraud was primarily a problem of shadowy scammers. The most prolific filter scammers and wage-thieves are giant music companies, like Sony Music, who claim nearly *all* classical music:

The Big Tech companies argue that they have an appeals process that can reverse these overclaims, but that process is a joke. Instagram takedowns take a few seconds to file, but *28 months* to appeal.

The entertainment industry is full of flagrant filternet abusers. Take Warner Chappell, whose subsidiary demonetizes videos that include the numbers “36” and “50”:

Warner Chappell are prolific copyfraudsters. For decades, they fraudulently claimed ownership over “Happy Birthday” (!):

They’re still at it - in 2020, they used a fraudulent claim to nuke a music theory video, and then a human being working on behalf of the company renewed the claim *after* being informed that they were mistaken about which song was quoted in the video:

The fact that automated copyright claims can remove material from the internet leads to a lot of sheer fuckery. In 2019, anti-fascists toyed with blaring copyrighted music at far-right rallies to prevent their enemies from posting footage of those rallies online.

At the time, I warned that this would end badly. Just a month before, there had been a huge scandal when critics of extremist violence found that automated filters killed their videos for featuring clips of that violence:

Since then, it’s only gotten worse. The Chinese Communist Party uses copyfraud to remove critical videos from Youtube:

and so does the Beverly Hills Police Department:

But despite all that, the momentum is for *more* filtering, to remove far fuzzier categories of content. The EU’s Terror Regulation has just gone into effect, giving platforms just *one hour* to remove “terrorist” content:

The platforms have pivoted from opposing filter rules to endorsing them. Mark Zuckerberg says that he’s fine with removing legal protections for online platforms unless they have hundreds of millions of dollars to install filters.

The advocates for a filternet insist that all these problems can be solved if geeks just *nerd harder* to automate good judgment, fair appeals, and accurate attributions. This is pure wishful thinking. As is so often the case in tech policy, “wanting it badly is not enough.”

In 2019, the EU passed the Copyright Directive, whose Article 17 is a “notice and staydown” rule requiring platforms to do instant takedowns on notice of infringement *and* to prevent content from being re-posted.

There’s no way to do this without filters, but there’s no way to make filters without violating the GDPR. The EU is still trying to figure out how to make it work, and the people who said this wouldn’t require filters are now claiming that filters are fine.
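
To see why “staydown” implies a filter, here’s a minimal sketch of the logic, assuming a naive exact-hash matcher (real systems would need fuzzy audio/video fingerprinting, which makes the problem harder, not easier):

```python
import hashlib

# Hypothetical "notice and staydown" check: every takedown notice adds a
# fingerprint, and every later upload is scanned against the whole set.
noticed_fingerprints: set[str] = set()

def fingerprint(content: bytes) -> str:
    # Stand-in for a perceptual audio/video fingerprint.
    return hashlib.sha256(content).hexdigest()

def register_notice(content: bytes) -> None:
    noticed_fingerprints.add(fingerprint(content))

def allow_upload(content: bytes) -> bool:
    # This scan has to run on *every* upload, forever - i.e., a filter.
    return fingerprint(content) not in noticed_fingerprints

register_notice(b"work named in a takedown notice")
print(allow_upload(b"work named in a takedown notice"))  # False: blocked at upload
print(allow_upload(b"an unrelated upload"))              # True
```

Once a platform has to check every new upload against every past notice, it is running a filter, whatever the regulation chooses to call it.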

Automating subtle judgment calls is impossible, not just because copyright’s limitations - fair use and others - are grounded in subjective factors like “artistic intent,” but because automating a flawed process creates flaws at scale.

Remember when Jimmy Fallon broadcast himself playing a video game? NBC automatically claimed the whole program as its copyrighted work, and thereafter, gamers who streamed themselves playing that game got automated takedowns from NBC.
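
As an illustration of that failure mode (my toy example, not NBC’s or Youtube’s actual pipeline), here’s what happens when a single over-broad reference entry lands in the matching database:

```python
# Illustration only: a reference database where one over-broad entry
# (a broadcast that happens to contain game footage) matches every later
# stream of that game. One bad entry, wrong claims at scale.
reference_db = {
    "nbc_broadcast": {"fallon_monologue", "game_footage"},  # whole show claimed
}

def claimants_for(upload_segments):
    return [owner for owner, segments in reference_db.items()
            if segments & upload_segments]

gamer_streams = [{"game_footage", f"gamer_{i}_commentary"} for i in range(10_000)]
false_claims = sum(1 for stream in gamer_streams if claimants_for(stream))
print(false_claims)  # 10000 - every stream draws an automated claim
```

The error isn’t in the matching step; it’s in the reference data, and automation faithfully reproduces it across every subsequent upload.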

The relentless expansion of proprietary rights over our virtual and physical world raises the stakes for filter errors. The new Notre Dame spire will be a copyrighted work - will filters block videos of protests in front of the cathedral?

And ever since the US’s 1976 Copyright Act abolished the registration requirement, it’s gotten harder to figure out who controls the rights to any work, so that even the “royalty free” music meant for Youtubers to use safely turned out to be copyrighted:

We need a new deal for content removal, one that favors working creators over wage-thieves who have the time and energy to master the crufty, complex private legal systems each platform grows for itself.

Back in 2019, Slate Future Tense commissioned me to write an sf story about how this stuff might work out in the coming years. The result, “Affordances,” is sadly still relevant today:

Here’s a podcast of the story as well:

Meanwhile, governments from Australia to the UK to Canada are adopting “Harmful Content” rules that are poised to vastly expand the filternet, insisting that it’s better than the alternative.
