Built bundles are not affected. The service is supposed to figure out which polyfills a particular browser requires and serve different scripts accordingly. Because it serves different scripts to different browsers, those scripts cannot be bundled or pinned with SRI; that would defeat the purpose of the service.
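To illustrate (a minimal sketch using only Node.js built-ins; the bundle path is hypothetical): SRI only works when the hash is pinned to one exact file, which is precisely what a per-browser polyfill service can't give you.

```typescript
// Sketch: compute an SRI value for a self-hosted, fixed bundle.
// "dist/polyfills.bundle.js" is a hypothetical path. This only works
// because the file's bytes never change; a service that returns a
// different script per browser has no single hash you could pin.
import { createHash } from "node:crypto";
import { readFileSync } from "node:fs";

const bundle = readFileSync("dist/polyfills.bundle.js");
const integrity =
  "sha384-" + createHash("sha384").update(bundle).digest("base64");

// The tag you would ship; the browser refuses to run the script if the
// served bytes no longer match the pinned hash.
console.log(
  `<script src="/polyfills.bundle.js" integrity="${integrity}" crossorigin="anonymous"></script>`
);
```

If the served bytes change for any reason, legitimate or not, the browser refuses to execute the script, which is exactly why a script that varies per request can't be protected this way.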

Code pulled from GitHub or npm can be audited, and it behaves consistently after it has been copied. If the code has a high reputation and gets incorporated into bundles, the code in those bundles doesn't change; if the project later becomes malicious, only recently created bundles are affected. Polyfill code, by contrast, is pulled from polyfill.io every time somebody visits the page, and polyfill.io was recently hijacked to sometimes send malicious code instead. Websites that have been up for years can be affected by this.

Docker Swarm encryption doesn't work for your use case. The documentation says that the secret is stored encrypted but can be decrypted by swarm manager nodes and by nodes running services that use the secret, and on a single node those are the same machine. If you don't have to unlock the swarm on startup, that means the encrypted values and the decryption key live next to each other on the same computer, and anyone who has access to the encrypted secrets can also decrypt them.
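For illustration, a minimal sketch (assuming a Swarm service with a hypothetical secret named `db_password`): the running service sees the secret as a plaintext file, so on a single node the at-rest encryption doesn't hide it from anyone who can reach that node.

```typescript
// Sketch: how a service consumes a Swarm secret at runtime. "db_password"
// is a hypothetical secret name. Swarm mounts secrets as plaintext files
// under /run/secrets inside the container, so on a single node that both
// stores and uses the secret, local access is enough to read it.
import { readFileSync } from "node:fs";

const secret = readFileSync("/run/secrets/db_password", "utf8").trim();
console.log(`loaded secret (${secret.length} characters)`);
```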

China is simultaneously destroying the environment for profit and investing too much money in green technology?

A distinctive feature of purchase subsidies for BEV in China, however, is that they are paid out directly to manufacturers rather than consumers and that they are paid only for electric vehicles produced in China, thereby discriminating against imported cars.

That's an interesting way to spin subsidies for the production of electric vehicles. Why would China pay companies in other countries to produce cars?

I looked it up before posting. It's illegal in 48 states, including California, where most of these companies are headquartered, and every state where major cloud data centers are located. That makes it effectively illegal under state law, which is the worst kind of illegal in the United States when you operate a service at a national level, because every state's law will be slightly different. No company is going to build a system that lets users in the two remaining states exchange revenge porn with each other, except maybe a website established solely for that purpose. Certainly Snapchat would not.

I've noticed a recent wave of reactionary laws that criminalize specific things that are already illegal, or should already be illegal, under a more general law. We'd be much better off with federal standardization of revenge porn laws than with a federal law that outlaws essentially the same thing but only when a specific technology is involved.

Web services and AI in general are completely different things. Web services that generate AI content want to avoid scandals, so they constantly block things that might be inappropriate in some situations, to the point where those services are incapable of performing a great many legitimate tasks.

Somebody running their own image generator on their own computer using the same technology is limited only by their own morals. They can train the generator on content that public services would not, and they are not constrained by prompt or output filters.

Modern AI is not capable of this. Its accuracy at detecting NSFW content is not good, and it is completely incapable of detecting when NSFW content is allowable, because it has no morals and understands nothing about people or situations beyond appearance.

“This kid who is not getting any kind of real consequence other than a little bit of probation, and then when he’s 18, his record will be expunged, and he’ll go on with life, and no one will ever really know what happened,” McAdams told CNN.

“If [this law] had been in place at that point, those pictures would have been taken down within 48 hours, and he could be looking at three years in jail...so he would get a punishment for what he actually did,” McAdams told CNN.

There's a reason kids are tried as kids and their records are expunged when they become adults. Undoing that will just ruin lives without lessening occurrences.

“It’s still so scary as these images are off Snapchat, but that does not mean that they are not on students’ phones, and every day I’ve had to live with the fear of these photos getting brought up resurfacing,” Berry said. “By this bill getting passed, I will no longer have to live in fear knowing that whoever does bring these images up will be punished.”

This week, Republican Senator Ted Cruz, Democratic Senator Amy Klobuchar and several colleagues co-sponsored a bill that would require social media companies to take down deep-fake pornography within two days of getting a report.

“[The bill] puts a legal obligation on the big tech companies to take it down, to remove the images when the victim or the victim's family asks for it,” Cruz said. “Elliston's Mom went to Snapchat over and over and over again, and Snapchat just said, ‘Go jump in a lake.’ They just ignored them for eight months.”

BS

It's been possible for decades for people to share embarrassing pictures of you, real or fake, on the internet. Deep fake technology is only really necessary for video.

Real or fake pornography depicting unwilling participants (revenge porn) is already illegal and already gets taken down, and because the girl is underage it's extra illegal.

Besides the legal aspect, the content described in the article, which may be an exaggeration of the actual content, is clearly in violation of Snapchat's rules and would have been taken down:

  • We prohibit any activity that involves sexual exploitation or abuse of a minor, including sharing child sexual exploitation or abuse imagery, grooming, or sexual extortion (sextortion), or the sexualization of children. We report all identified instances of child sexual exploitation to authorities, including attempts to engage in such conduct. Never post, save, send, forward, distribute, or ask for nude or sexually explicit content involving anyone under the age of 18 (this includes sending or saving such images of yourself).
  • We prohibit promoting, distributing, or sharing pornographic content, as well as commercial activities that relate to pornography or sexual interactions (whether online or offline).
  • We prohibit bullying or harassment of any kind. This extends to all forms of sexual harassment, including sending unwanted sexually explicit, suggestive, or nude images to other users. If someone blocks you, you may not contact them from another Snapchat account.

The formerly-in-business website formerly known as Twitter.

Are they going to officially allow third-party apps at all? The stock app is terrible, and not just because of excessive, unskippable advertising and bizarre restrictions around background play. When you search for anything, at least half of the results are completely unrelated to what you searched for, in an attempt to increase user engagement metrics. It keeps trying to get you to watch shorts in its bad TikTok clone. Sometimes it recommends unrelated shorts with disturbing thumbnails in the middle of your search results. It keeps autodetecting that the video quality should be 360p on a connection easily capable of 4K, and resetting back to 360p at the start of every new video. The UI for live streams puts things on top of other things that are more important.

Bluesky is not decentralized. They've promised that it will be, but I wouldn't be surprised if they never allow open federation.


What a non-story. The username, profile picture, profile posts, and post interactions are all required to display the content the Threads user has subscribed to. The IP address is required to connect to the service and retrieve that content. Facebook doesn't get any more access to your data than is necessary, nor any more than anybody else gets. This is just fear mongering.
