Yeah, another AI is the end of us all post


This topic contains 32 replies, has 5 voices, and was last updated by  Simon Paynton 11 months, 3 weeks ago.

Viewing 15 posts - 1 through 15 (of 33 total)
  • #54067

    Unseen
    Participant

    I earned a living photographing girls posing nude in, umm, alluring poses for about a decade. Back then, you needed a camera, a willing girl, and some money (oversimplified, but those were the basics). Today, with AI you can create images of even more ideally beautiful girls without a camera, a girl, or (beyond whatever the initial investment in software and computing power is) much else. No disputes with models or their boyfriends, no modeling fees to pay, no 18+ age documentation to keep, etc., and you can produce results so realistic that the line between real and unreal is blurred almost, if not absolutely, completely.

    Take this saucy but nonpornographic example:

    It’s virtually indistinguishable from the work of a high-end professional photographer! Touches like the tuft of flyaway hair over her left ear help it look real.

    And then, consider these bikini images of sexy AI-generated “girls,” who look like they might not even be 18 yet!

    Do you see the possibilities for underage porn and the thorny legal issues that could pop up if the images of such imaginary girls went pornographic?

    BTW, one “AI lookbook” link turned up in my YouTube feed, and once I clicked on it I started getting more. As I did more research, well, I get a lot of them now.

    And, oh yes, no need to look: the people who produce these saucy images can easily also produce hardcore porn images of equal realism.

    #54070

    Unseen
    Participant

    BTW, in case you’re wondering about the legal issue with AI images of underage “girls” or “boys” engaged in sexual activities: the legal theory behind what makes child porn illegal is that it documents the actual abuse of an underage person. If there’s no underage person, there’s no abuse.

    Now, some psychologists might look at this and say, “Maybe we can use such images to divert potential child abusers into enjoying their proclivity legally. After all, nobody is going to go out and try to have sex after they’ve masturbated.” I’ve heard psychologists say that they don’t buy pornography as a cause of rape, because men masturbate to pornography and get release. The men to worry about are those who can’t get sufficient release that way and therefore must act out.

    #54071

    _Robert_
    Participant

    Would not surprise me if all this fakery results in sophisticated analysis tools to determine what is real vs. what is AI-generated. AI will leave certain detectable markers, for now at least. It may even raise the market demand for genuine media in anything from news and politics to pornography.

    #54072

    Unseen
    Participant

    Would not surprise me if all this fakery results in sophisticated analysis tools to determine what is real vs. what is AI-generated. AI will leave certain detectable markers, for now at least. It may even raise the market demand for genuine media in anything from news and politics to pornography.

    I’m not sure why anyone would care. Not the guy with the dick in his hand. As for enforcing laws against porn involving underage girls, a whole new legal theory would have to be legislated, and I’m not sure there would be a need to, since such porn created without abusing children is a plus, in a sense. If the legal basis is that we just don’t like people enjoying such content, even when it does not harm anyone and may actually prevent harm, that’s a bankrupt rationale.

    #54073

    Here is a link from Sunday School, May 26th, that covered some of the issues found when checking fake news and AI-generated ‘politicians’.

    #54074

    Unseen
    Participant

    FYI, AI-created “girls” from various countries:

    #54075

    _Robert_
    Participant

    Would not surprise me if all this fakery results in sophisticated analysis tools to determine what is real vs. what is AI-generated. AI will leave certain detectable markers, for now at least. It may even raise the market demand for genuine media in anything from news and politics to pornography.

    I’m not sure why anyone would care.

    Because they get off on it being “bad” or wrong. They get off on guys abusing and using women. From the safety of mommy’s basement, of course. It’s like legalizing weed. Half the heads I know barely even smoke anymore, since granny can run down to the dispensary and pick up some shit stronger than anything they had before. Where is the thrill in that?

    This here, this is some REAL porn.

    #54077

    Unseen
    Participant

    Because they get off on it being “bad” or wrong. They get off on guys abusing and using women. From the safety of mommy’s basement, of course. It’s like legalizing weed. Half the heads I know barely even smoke anymore, since granny can run down to the dispensary and pick up some shit stronger than anything they had before. Where is the thrill in that?

    I don’t think anyone who studies pedophilia would categorize it as anything like a phase or craze. It’s a basic sexual orientation like heterosexuality or homosexuality, because many pedos say they knew early on what their preference was, at a time in their life when the rest of us knew we were into chicks or guys. It’s a defective orientation, to be sure, but it’s not like bellbottom pants or mohawk hairdos or “retro” clothing. If someone becomes an active pedo after simply seeing some pedo porn, it pretty much has to be due to a latent tendency they’ve been in denial about, not an infection.

    #54078

    _Robert_
    Participant

    Because they get off on it being “bad” or wrong. They get off guys abusing and using women. From the safety of mommy’s basement, of course. It’s like legalizing weed. Half the heads I know barely even smoke anymore since granny can run down to the dispensary and pick up some shit stronger than anything they had before. Where is the thrill in that.

    I don’t think anyone who studies pedophilia would categorize it as anything like a phase or craze. It’s a basic sexual orientation like heterosexuality or homosexuality, because many pedos say they knew early on what their preference was, at a time in their life when the rest of us knew we were into chicks or guys. It’s a defective orientation, to be sure, but it’s not like bellbottom pants or mohawk hairdos or “retro” clothing. If someone becomes an active pedo after simply seeing some pedo porn, it pretty much has to be due to a latent tendency they’ve been in denial about, not an infection.

    Nah. It’s often considered more a violent act by someone who is mentally ill than a normal act of sexual preference. There is no legal consent that can even occur, so right there you have a fucked-up power/control situation.

    #54079

    Here is an academic paper looking at how AI will generate harmful social media content.

    #54080

    Unseen
    Participant

    @ Reg

    I’d really like us to stick with the original focus on the implications for porn. That’s the discussion I’m hoping to have. That article may talk about more important things, but it’s a bit off-topic here.

    #54081

    Sure, no problem, Unseen.

    #54082

    Unseen
    Participant

    Sure, no problem, Unseen.

    No sweat. It’s a good topic for another post. I’d likely join in. Just find a juicy article with wide appeal and start it up.


    #54084

    Unseen
    Participant

    Another wrinkle in this topic is fake nudes of real people, now a threat to high school teens, mostly girls.

    ‘I Felt Shameful and Fearful’: Teen Who Saw AI Fake Nudes of Herself Speaks Out

    #54085

    Simon Paynton
    Participant

    Nah. It’s often considered more a violent act by someone who is mentally ill than a normal act of sexual preference.

    Mental illness doesn’t change someone’s basic personality: it couldn’t turn someone into a paedo.  It’s an act of sexual preference, but not a “normal” one.

    If there are tons of deepfake kiddie porn pictures out there, how are we to know if there are real ones in there too? Perhaps real ones would become more sought after since, I imagine, a high proportion of child abusers are sadistic by nature and would enjoy the thought of a child actually being harmed to produce a few photos.

    I was in a mental hospital with a paedophile who had tried to kill himself after being caught with a “secret computer within a computer” so that he could look at what I read in the paper afterwards were pictures of the highest category of shocking and disturbing child abuse. Many of them are extreme sadists, in my opinion. He was on his way to prison, which he wasn’t looking forward to. He may have been an annoying twat to be around, but the consensus of opinion on the ward was “let he who is without sin cast the first stone”. I think if he’d been in contact with women patients he might have been given a harder time.

