  • Oh, and as evidenced by the government losing control of the backdoors they implanted into telco companies, this data will be hacked. And having all of it in one place will make it a big target.

    So it won’t just be the U.S. government that knows your business; the U.S.’s foreign rivals will too.
    Great fucking strategy. Let’s make all our health data accessible instead of improving the health system, and just hope that none of the people whose information can be freely and easily bought online, thanks to lax privacy laws, have expensive medical bills and a sensitive job.

    Way to improve national security. Dumbasses.


  • What an unexpectedly deep bit of research this threw me into.

    In 2005, a company called Fortress Credit loaned the Trump Organization $130 million for the construction of Trump Tower, a loan it later ‘forgave.’ Fortress Credit is owned by Fortress Investment Group, which is owned by SoftBank. Additionally, SoftBank tried to engage with Trump in 2017 under a similar scheme, offering to invest in the U.S.’s IT infrastructure as part of some deal they were cooking up with him.
    Incidentally, in 2019, New Fortress Energy, also part of the constellation of companies, was granted a peculiar permit to transport LNG over rail lines within populated areas - something that is generally not done due to the danger involved.
    So that’s just, you know, the corruption cherry on top of this shit cake.

    So now we have SoftBank, OpenAI, and Oracle - companies whose CEOs ‘bent the knee’ - announcing a half-trillion-dollar investment in an all-knowing AI medical black box that the government (and its corporate sponsors) intends to use to track all your medical information.
    Yes. Centralized government tracking of your periods, ladies. If this system ever works, the government will know if you’ve ever told a medical professional that you use illicit drugs, even drugs that may be legal at the state level but illegal federally. The government will know if you’re on antidepressants - something RFK Jr. wants to send people to re-education/labor camps for being on. They will know if you’ve ever told a doctor that you’re not cis or straight.

    And we know that SoftBank can’t afford to invest that much. They took out a $4 billion loan two years ago, and then asked for another $1.1 billion shortly after. Even Elon Musk is saying they don’t have the money.
    So they’re going to invest some money in something, get very overpriced government contracts for the operation of it, and use a fraction of the overages from that to ‘invest’ further until their obligation is fulfilled or forgiven – in much the same way that telcos fleeced the government to build out broadband and never did.

    It’s a bad deal for all of us, and the very best outcome for anyone is that it will never work.
    Because if it does work, we will lose our medical privacy and lose control of any data we’ve ever shared with medical professionals - one of the few areas of U.S. citizens’ lives where there are privacy laws to keep them safe.






  • Yeah, I know. And I know there’s way more market demand for mirrorless, as well as simpler mechanicals, so they have fewer failure points, but do I ever love the sound and that subtle feeling of a mirror slapping up and the shutter flicking out of place.
    The feedback that offers makes it feel like you’re doing something ‘real’ when you take a photo. Everyone knows that you captured that moment. Those photons are yours forever, trapped in your little art-making box.

    It’s kind of romantic, in a way. I feel like modern tech is great, but tends to be inscrutable.







  • My organization seems to have already thrown in the AI towel, or is at least resorting to magical thinking about it.

    We’re highly integrated with Microsoft - Windows Login, Active Directory, Microsoft 365, and even a managed version of Edge as the org-wide ‘default’ browser that we’re encouraged to sign into with our organizational credentials to sync account information, etc. Our AI policy is basically “You can use any Microsoft AI feature your account can access.”
    They can try to block whatever sites they want with the firewall, but once you let a user get comfortable with the idea of allowing systems to exfiltrate data, you aren’t going to also make them more discreet. They’re trusting that by throwing open the floodgates, users will actually use Microsoft’s offerings instead of competitors’ - as if folks who sometimes still cannot tell the difference between a web browser and ‘the internet’ will know the difference. And they’re also trusting Microsoft to uphold our enterprise license agreement and its own security to keep that data within our own cloud instance.

    Boy howdy, this will be interesting.




  • Addiction has a medical definition, not a connotation.
    As previously shown, SSRIs do not cause addiction, even if they can cause withdrawal or physical dependence in some people.

    I guess I’m wondering: if support for this policy has to be riddled with asterisks and accompanied by statements expressing hopes about how the programs will be run, then why express any support for them at all?

    And finally: there are safe places available for people to go if they feel they are having mental health issues that require more intensive care. Mind you, these are really only available to people with health insurance - Reagan largely killed off federal and community mental health care in the ’80s. That care cannot be replaced with a labor camp.
    The only proper replacement for that care is rebuilding those systems, and that is not what RFK is proposing. He’s proposing labor camps to take advantage of vulnerable populations and lock them away.