• 0 Posts
  • 36 Comments
Joined 9 months ago
Cake day: May 14th, 2024




  • This assumes a legitimate need to prove who you are outside the context of that specific site, rather than just within it. Sometimes that need is real, sometimes it is not.

    When it’s not, and you only need to prove you are the same person who created the account, then a simple username and password is sufficient. Use 2FA (via an authenticator app or hardware key, NOT via SMS or email) on top of that. This lets users prove, to a sufficient degree, that they own the account. (There’s a rough sketch of how authenticator-app verification works after this comment.)

    This is how most Lemmy instances work, for example. I can sign up by creating a username and password, with optional 2FA. They do not need my email. They do not need my phone number. They do not need my name, or my contacts, or anything else that is not related to my identity within their server.

    I realize that this is untenable at large scales for any communications platform. Spam (and worse) is a problem wherever there are easy and anonymous signups. I’m honestly not sure how Lemmy is as clean as it is. I guess it’s just not popular enough to attract spammers.
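    Continuing the 2FA point above, here is a minimal sketch of how authenticator-app codes (TOTP, per RFC 6238) are generated and checked. Python standard library only; it’s illustrative, not production auth code, and the function names are mine:

    import base64, hashlib, hmac, struct, time

    def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
        """Derive the current code from the shared base32 secret."""
        key = base64.b32decode(secret_b32, casefold=True)
        counter = int(time.time()) // interval             # 30-second time step
        msg = struct.pack(">Q", counter)                   # 8-byte big-endian counter
        digest = hmac.new(key, msg, hashlib.sha1).digest()
        offset = digest[-1] & 0x0F                         # dynamic truncation (RFC 4226)
        code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % 10 ** digits).zfill(digits)

    def verify(secret_b32: str, submitted: str) -> bool:
        """Compare the submitted code against the expected one in constant time."""
        return hmac.compare_digest(totp(secret_b32), submitted)

    Note that nothing in this exchange requires an email address or a phone number: just a shared secret established at signup, which is exactly the point.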


  • which would indicate that it’s somehow needed to generate AI-generated CSAM

    This is not strictly true in general. Generative AI is able to produce output that is not in the training data, by learning a broad range of concepts and applying them in novel ways. I can generate an image of a rollerskating astronaut even if there are no rollerskating astronauts in the training data.

    It is true that some training sets have included CSAM. Back in 2023, researchers found a few thousand such images in the LAION-5B dataset (roughly one per million images). 404 Media has an excellent article with the details: https://www.404media.co/laion-datasets-removed-stanford-csam-child-abuse/

    On learning of this, LAION took the dataset down until it could be properly cleaned. Source: https://laion.ai/notes/laion-maintenance/

    Those images were collected from the public web. LAION took steps to avoid linking to illicit content (details in the link above), but clearly it’s an imperfect system. God only knows what closed companies (OpenAI, Google, etc.) are doing. With open data sets, at least any interested parties can review, verify, and report this stuff. With closed data sets, who knows?




  • Are you able to spend a lot of money on it? Last I checked, a few places in the EU had residency-by-investment tracks (with a path to citizenship) if you purchased substantial property. So if you’re in a position to buy a nice house, that’s an option. I think Portugal is the most approachable cost-wise, but it’s been a while since I looked at this, so I’m sure things have changed.

    Several countries will allow extended student visas, even if you only speak English. I think Sweden allows this.

    Then of course there’s the easy way: marry a Canadian.


  • I think it’s just for enterprise contracts, yeah.

    Fedora seems like a good general-purpose pick to me: it’s modern, easy enough to install and use, and has a large community. It offers the same advantages as Ubuntu (a large community and broad commercial third-party support) without the downsides of shipping a lot of outdated software or lagging on new hardware support. I think Fedora is less likely to hit show-stopping limitations than a lot of other distros, even beginner-friendly ones like Mint.

    But that’s just one opinion. There’s nothing wrong with Ubuntu or its derivatives. I’ve heard good things about Pop!_OS as well, though I’ve never tried it myself.



  • That’s when Windows 10 stops getting security updates. Expect most software vendors to drop support for Windows 10 this year if they haven’t already. That doesn’t necessarily mean things will stop working, but their software will no longer be tested on Windows 10, and they won’t spend time fixing Win10-specific problems.

    In enterprise, you can get an additional three years of “extended security updates”. That’s your grace period to get everyone in your org upgraded.

    While I strongly relate to anyone who hates Windows 11, “continue using Windows 10 forever” was never a viable long-term strategy.

    Windows 10 was released in 2015. Ten years of support for an OS is industry-leading, on par with Red Hat or Ubuntu’s enterprise offerings and far ahead of any competing consumer OS. Apple generally only offers three years of security updates. Google provides 3-4 years of security updates. Debian gets 5 years.

    There has never been a time in the history of personal computing when using an OS for over 10 years without a major upgrade was realistic. That would be like using Windows 3.1 after XP was released. Windows 10 is dead, and it’s been a long time coming.

    Now go download Fedora.


  • Silly question perhaps, but are you sure you’re using the correct port on your Linux system? If I plug my external HD into a USB2 port, I’m stuck at 30-40MB/sec, while on a USB3 port I get ~150-180MB/sec. That’s proportionally similar to the difference you described so I wonder if that’s the culprit.

    You can verify this in a few different ways. From a terminal, if you run lsusb you’ll see a list of all your USB hubs and devices.

    It should look something like this:

    Bus 002 Device 001: ID xxxx:yyyy Linux Foundation 3.0 root hub
    Bus 002 Device 002: ID xxxx:yyyy <HDD device name>
    Bus 003 Device 001: ID xxxx:yyyy Linux Foundation 2.0 root hub
    Bus 004 Device 001: ID xxxx:yyyy Linux Foundation 3.0 root hub
    

    Here you can see three root hubs: one is USB 2.0 and the other two are 3.0. The HDD is on bus 002, which we can see is a USB 3.0 hub from the description of Bus 002 Device 001. That’s good.

    If instead the drive shows up on a 2.0 bus, or on a bus crowded with other devices, that’s bad: re-organize your USB devices so low-speed peripherals (mouse, keyboard, etc.) sit on a USB2 bus and only high-speed devices share the USB3 bus.

    You can also consult your motherboard’s manual, or just look at the colors of your USB ports. By convention, white or gray ports are USB 1.x, black ports are 2.0, and blue ports are 3.x.

    If you’re running KDE, you can also view these details in the GUI with kinfocenter. Not sure what the Gnome equivalent is.
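    If you want the negotiated speed directly instead of inferring it from the hub layout, lsusb -t prints the device tree with speeds (480M means USB 2.0; 5000M and up means 3.x). You can also read it from sysfs; here is a rough Linux-only sketch, assuming the standard /sys/bus/usb layout:

    # List each connected USB device with its negotiated speed.
    # 1.5 or 12 Mbit/s = USB 1.x, 480 = USB 2.0, 5000+ = USB 3.x
    import glob, os

    for dev in sorted(glob.glob("/sys/bus/usb/devices/*")):
        speed_path = os.path.join(dev, "speed")
        product_path = os.path.join(dev, "product")
        if os.path.isfile(speed_path) and os.path.isfile(product_path):
            with open(speed_path) as f:
                speed = f.read().strip()
            with open(product_path) as f:
                product = f.read().strip()
            print(f"{os.path.basename(dev)}  {product}: {speed} Mbit/s")

    If the external drive reports 480 Mbit/s, it has negotiated USB 2.0, and the ~30-40MB/sec ceiling follows directly from that.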


  • Half the movies released in 3D during the last wave were poorly done conversions not even shot for 3D.

    Only half? -_-

    I’ve only seen a few movies that were actually filmed in 3D. Even Gravity was filmed in 2D.

    The problem is that actually filming in 3D requires different (and expensive) hardware, and different creative direction across the board. You can’t just upgrade to a 3D camera and call it a day. Not many studios will put in that kind of effort for something that isn’t proven in the market. And not many filmmakers are actually skilled at working in 3D, simply due to lack of direct experience.

    I saw the Hobbit movies in high framerate 3D in the theater, and while they were not good movies, they looked absolutely amazing because they were committed 100% to the format from start to finish — not just with the hardware, but with the lighting, makeup, set design, everything. It’s a shame the movies sucked, and it’s a shame that there has never been a way to watch them in HFR 3D outside of select theaters.


  • They’re like 20 years too late to start copying Apple here. Apple had their shit together with their product line for a good while after Steve Jobs returned and eliminated the absolute insanity of Apple’s mid-90s lineup, which had at least three times more models than any sane person would find useful.

    But recently, Apple went off the deep end. Boggles the mind that “Pro Max” ever made it past the brain-mouth barrier in a boardroom, let alone into an official product lineup.




  • Yep. AGI is still science fiction. Anyone telling you otherwise is probably just trying to fool investors. Ignore anyone who is less than three degrees of separation away from a marketing department.

    The low-hanging fruit is quickly getting picked, so we’re bound to see a slowdown in advancement. And that’s a good thing. We don’t really need better language models at this point; we need better applications that use them.

    The limiting factor is not so much hardware as it is our knowledge and competence in software architecture. As a historical example, just 10 years ago computers were nowhere near top-level at Go. Then DeepMind developed AlphaGo, a huge leap forward that could beat a top pro, and it ran on a supercomputer cluster. Thanks to the research breakthroughs around AlphaGo, within a few years we had similar AI that could run on any smartphone and beat any human player. It’s not because consumer hardware got that much faster; it’s because we learned how to make better software. Modern Go engines are a fraction of the size of AlphaGo and produce similar or better results with a tiny fraction of the operations. And it seems like we’re pretty close to the limit now: a supercomputer can’t play all that much better than my laptop.

    Similarly, a few years ago something like the original ChatGPT needed a supercomputer. Now you can run a model with similar performance on a high-end phone or a low-end laptop. Again, it’s not because the hardware improved; the difference is the software. My current laptop (a 2021 model) is older than ChatGPT (publicly launched in late 2022), and it can easily run superior models.

    But the returns inevitably diminish. There’s a limit somewhere. It’s hard to say exactly where, but entropy’s gonna getcha sooner or later. You simply cannot fit more than 16GB of information in a 16GB model; you can only inch closer to that theoretical limit, and specialize into smaller scopes. At some point the world will realize that trying to encode everything into a model is a dumb idea. We already have better tools for that.
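    To put very rough numbers on that last point (back-of-envelope only; the figures are purely illustrative):

    # Upper bound on how much information a model file can possibly encode.
    model_size_gb = 16
    bits_total = model_size_gb * 8 * 10**9          # 16 GB is about 1.28e11 bits

    # At 8-bit weights that's ~16 billion parameters; at 4-bit, ~32 billion.
    params_8bit = bits_total // 8
    params_4bit = bits_total // 4

    print(f"capacity: {bits_total:.3e} bits")
    print(f"~{params_8bit / 1e9:.0f}B params at 8-bit, ~{params_4bit / 1e9:.0f}B at 4-bit")

    However the model is trained or architected, the file can’t hold more than that; better software only gets you closer to the ceiling, it doesn’t move it.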


  • I know this is from 2015, but even then, it was a bit late to make this argument. This was already mainstream enough in the 90s to be the punchline in syndicated comic strips. By 2015, we already had “customer experience engineers” (i.e. tier-1 helpdesk). The ship has not only sailed, it has sunk.

    Anyway, the phrase originated in an era when programming was very different from what it is today, when most programmers came from a background in electrical engineering or something along those lines.