Jess Weatherbed
a lion in a playground or something, again, nefarious, you would usually do so for a giggle. Are you really going to put in that much effort to learn all the necessary skills, pick up the tech (again, free or paid, which is expensive if you want to go down that route), and go through all that effort for a joke, for something that you think is going to be funny? Whereas now there isn't any effort required. So the argument that these two things are similar, that you're changing an image either way, isn't a bad one, but it completely ignores the fact that the accessibility and the scale of these tools is the issue at hand, not what's actually happening.
It's such a washy answer to give, but they really exist in context, right? Girls and boys, men and women alike, with all their body image issues, wanting to have themselves perceived on social media, to their enclosed audience, as being in some way the ideal version of themselves.
And I don't think that they're bad for that, having been exposed to all these idealised images over the last decade or so, ever since the advent of social media, especially Instagram, which is really, really bad for this. But it is effectively the same argument as the one we're making right now: how much of this are we willing to accept before it becomes problematic?
Because the more this landslides into a situation where we're asking how much of this is actually reality, the less we can trust the images in front of us. So I think it's aligned. It's almost symptomatic in a way.
I think it's definitely happened. Like it's firmly happened already.
If any celebrity uploads a photo, you can go through the comments, something like, I don't know, the Kardashians, you can go through all of their stuff and you'll have people microanalyzing the background of their images, trying to find any kind of distortion to see whether they've made their waist slimmer or their bum bigger or whatever.
So people are already very, very heavily scrutinising the images they're seeing in front of them. But I don't think they quite understand yet, on a big scale, the level of changes that can be applied, because at this point we've come to accept that body editing in these contexts is just something that people do, that people feel bad about themselves.
They might make their teeth whiter. They might make their face smaller. All of these kinds of things. I don't think people are going to rush to the realisation, if they're looking at a picture, that the entire background might not be real. Or again, if someone's sharing a viral image of something that's meant to be, and I'm trying to think of something that's not going to be too controversial, like an explosion in a bin or something that's going to stir up local news, they're not going to look at that immediately and go, that's fake, because why would you? There has to be a narrative behind that. They understand the body side of it, so there is an adaptation happening, definitely.
I think so, yeah. One of the most common phrases I keep seeing throughout this argument is "the cat is out of the bag". It's already out; you can't do anything about it. And it's very defeatist. It's almost like people half understand the scale of the issue and can't see a future where anything is going to be done about it.
So they've just discounted any kind of reality in front of them from this point on: any picture they see online is fake. Some people are already at that point, definitely. But I don't think a lot of people are there.
Especially not older relatives, when we talk about people on Facebook, nans or aunts and so on, sharing obviously fake images as if they were real. There's the whole crab Jesus meme of early Midjourney edits and things like that. But there aren't enough people, I think, scrutinising the right things at this point.
They've only come to the understanding that we can change our physical appearances because they understand why someone would do so. They can't really understand why people would use these tools for nefarious purposes, despite the fact that, when you think about it for a second, it's quite easy, and you wouldn't even have to be that evil about it.
I saw a picture that our colleague Chris Welch had made using the Reimagine tool on the Pixel 9. I believe it was a roach added to a takeaway, a takeout, sorry, Britishism. And immediately I was like, that's probably going to cause some problems for small businesses, right? If I could just order some food, log a complaint and say, hey, look, you've added something to my food.
I want my money back. There are so many smaller-level scams and bad intentions that could be carried out using these tools, things people wouldn't have had the energy, the effort, or even the skill, if we're being realistic, to pull off before, because it wasn't worth it. But if I can do that in five seconds, it's quite tempting for some people, I imagine. There's a slippery slope.
The way that I've had it explained is that the system itself is absolutely fine. The way it's supposed to roll out is fine; it makes complete sense. The problem is that you need everyone on board for it to work, which is just completely unrealistic. You would need to get people with completely different ideologies, pro or anti generative AI, on board with saying, we're going to make this a robust identification system. And unless you have all of the camera makers, all of the editing software makers, and all of the online platforms, not just the social media ones but literally everywhere you would see an image, on board with this one system,
I don't think it's going to make a meaningful difference. And it doesn't really solve the issue they have at the minute, which is how to convey that an image is AI-generated, or just edited using AI, in a concise and meaningful way, without giving people a wall of text that no one is going to read, right?
I'm not going to sit there and read a paragraph of what has gone into a picture on my friend's holiday snaps or something.
Meta tried this with their "Made with AI" labels, and it went pretty badly, because they provided no context. There were photographers who'd used a couple of what sounded like pretty basic tools in Adobe Photoshop, ones that use a very standard form of generative AI, something like background removal or object select.
The system was effectively flagging that and saying, hey, you've used gen AI, therefore we're going to tag your entire image as being made with this stuff. And it gives the wrong impression, because people see that word, AI, and immediately think, OK, fake, this entire thing is fake. So there's no nuance in it.
That's a much more complicated problem to solve, alongside the existing issue of trying to get thousands of organisations and companies on board with adopting this one system. So it's not going to be a bulletproof solution, and they know that as well.
The Content Authenticity Initiative has been fully transparent about that: they've said this might help, but it's not going to solve the problem. So we're in a bit of a mess at the minute, a bit of a bind. A lot of things should have been put in place before AI got to this point, and now that we're playing catch-up, we're too slow.
And there's not a meaningful way to speed up this process at this moment.
It's stored, as far as I'm aware, on their own independent database, where they're trying to set up a process so you can independently check images. That's one avenue. The other avenue was supposed to be that you would access that information through the online platforms where you're already viewing the image that may or may not be fake, right? In terms of
actually manipulating that information, it's already been proven that you can do so. There are safeguards in place: I've heard that if you screenshot an image using certain desktop software, that metadata can still carry over. There are systems that can recognise that what you're screenshotting carries the metadata, and will carry some of it over.
But if I were to take, I don't know, my phone out of my pocket and take a picture of something on my desktop monitor, yes, the quality is going to look absolutely awful by that point, but none of that metadata is going to be present, and I still have a copy of that manipulation. At that point, all of the data is just stripped out and there's nothing you can do.
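The point about re-photographing a screen can be sketched with a toy model. This isn't the real C2PA format, just an illustration, with made-up field names, of why provenance data that travels alongside the pixels in a file, rather than baked into them, vanishes the moment an image is re-captured:

```python
# Toy model: provenance metadata (a C2PA-style signed manifest) is stored
# alongside the pixel data in the image file, not inside the pixels.
image = {
    "pixels": [[10, 20], [30, 40]],            # what gets shown on screen
    "metadata": {"c2pa": "signed manifest"},   # what travels in the file
}

def rephotograph(img):
    """Simulate pointing a phone camera at the monitor: only the
    displayed pixels are captured; the file's metadata never is."""
    return {"pixels": [row[:] for row in img["pixels"]], "metadata": {}}

copy = rephotograph(image)
print(copy["pixels"] == image["pixels"])   # True: the picture survives
print("c2pa" in copy["metadata"])          # False: the provenance is gone
```

The screenshot-software case mentioned above is the one exception: software that knows about the manifest can deliberately copy it over, but nothing forces it to.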
Physical watermarks as well. They've explored that, and we can remove those. Samsung's own tool removes them, if I remember correctly. It watermarks the images it makes, the text-to-image tool they've put on Samsung devices, and you can just use the object erase tool to remove the watermark directly in the app that made it. So at the minute there isn't anything more robust. It's a constant cycle where people keep finding ways around it.
You've got two main issues with that at the moment. One is speed. Even though Europe is pretty far ahead on this at the minute, they've already enacted laws, which have been heavily scrutinised because of the second issue, which I'll get to in a moment.
But if we're talking about things like US legal procedure, we could be waiting years for something to actually come into effect and rein in the bad actors using these apps for bad purposes. Because they're not inherently bad tools; you could be using them for something whimsical and absolutely innocent, but you have to separate the bad uses from the good ones.
And when you talk about things like deepfakes, you get onto the second issue, which is nuance. What do you consider a deepfake at this point? If I take a picture of myself on a Google Pixel and use their face-blurring tool, or I put my face onto my friend's body and put that on Facebook, would that count as a deepfake if someone wanted to take me to court, or for any kind of legal fallout at that point? It becomes such a granular argument that it's really difficult to put on paper what should be restricted, because at that point you're effectively placing limitations on creativity, which is difficult to do.
And you've also got to worry about things like free speech around that, and all the existing laws you'd have to unwind. So it's not an easy or fast process. And a couple of years in our time frame is a millennium in the development time we've seen in the AI landscape, so we have no idea what could be happening with generative AI in two years.
Again, two years ago we were seeing pretty shoddy mashups being thrown together by these things. They weren't believable. They were interesting, pretty fun to play around with, but they weren't necessarily convincing.
And now we're at the point where you could spit something out with Grok, on a social media platform itself, and mislead thousands of people, millions of people if you wanted to. That's a completely separate thing that's happened.
Maybe, but that would dishearten me considerably, as there are significantly worse things happening than whatever Elon Musk could be framed to be doing, right? If we think about how a lot of these applications are already being used, there are a lot of celebrities, particularly female celebrities, being deepfaked, particularly in sexually graphic ways. And that was a big deal.
I can't remember how long ago this was, my memory eludes me, but there was that big hacking incident where people actually broke into celebrities' hard drives and stole nude selfies they'd taken of themselves. And in that circumstance, there was a lot of vitriol spewed in those celebrities' direction for taking the pictures in the first place.
Now that can happen completely non-consensually, without them ever having taken such a picture. People can just make them look nude and put them into very compromising positions online. So it would be pretty grim, I think, for all these people to ignore the fact that this is already happening. I'm pretty sure it has already happened to them as well. I haven't looked, because I'm not, you know, a complete degenerate.
But I think there is an argument to be made that a legal process is probably going to step in here, that this is going to cause enough problems for enough people on a smaller scale, maybe like the takeout thing I described, that it's just going to increase small-scale scams, fraud and catfishing on platforms, until people make enough of a ruckus to go, we're sick of this, and we're sick of not being able to trust anything we're seeing in front of us anymore.
And you need to do something about it because everyone is just too easily manipulated at this point.
The narcissism thing could absolutely work in this context.
I think the thing they have to fight against is inherently the scale, though. Even if something devastating happens to them, they're in the same boat at that point as everyone else, which is: if you wanted to Photoshop one of these images before, and you were a very skilled photo editor, you could maybe do so in 20 or 30 minutes.
And you would have to be very skilled to do so. If not, and you were just some regular Joe wanting to do something, again, malicious, or to make money off these weird dark-web porn sites or something, you would only be able to do it on a much smaller scale and in a much more contained way.
So if you went through photo-filtering systems to find these images and get them removed from the internet, that was a much easier process. Now, it's not necessarily tools from Meta or the like that are churning these out; stuff definitely slips through their guardrails, but there are generative AI systems out there that have no such guardrails.
They've just been built using all the existing technology that's already available, and they can spit out thousands of these images if they want to. If you go looking for, again, nude deepfakes of celebrities, or, God forbid, really illegal stuff, we're talking children involved, or violent and gory images and all sorts of things, the sheer scale of it makes them incredibly difficult to track and remove.
And because everything is online at this point and no one is using their real identity, how do you find these people to prosecute them? And how can you expect a platform to reasonably police that, if they've proven in court, in some way, that they have actually used all of the tools at their current disposal, which are inadequate, and we know they are inadequate, to address the situation?
So it's almost like we're going around in circles at this point. The technology driving the problem is getting progressively better, and the tools we have to address it are not.
I can definitely see the pros. I'm not pooh-poohing the idea of generative AI, saying it should all be burned to the ground and scrapped forever. I don't think that's even possible anyway, because someone would just rebuild these systems. You're going to get them cropping up, right?
If Photoshop disappeared tomorrow, there are a million other photo-editing tools that could replace it if you tried hard enough. The bigger issue is weighing even the good it's doing; it's definitely helping efficiency, and there are applications for it that are beneficial. I think the Magic Eraser tool that was on, well, that is on Google Pixel phones is still a fantastic thing.
There are things in the background of photos that you don't want there, and getting rid of them isn't something you should feel guilty about, or that's necessarily going to cause problems in the vast majority of situations. So there are definitely use cases where these tools are useful, especially for big industries and the like.
I personally don't like the argument where it's like, it boosts efficiency for creatives. Because a lot of creators will look at these things and go, but it's taking the creativity out of what I do. It's doing it for me. But at the end of the day, if all you want is a marketing picture to slap on an ad and then maybe change a couple of backgrounds or something, yeah, this is helpful.
It can do that a lot quicker than a human can. Absolutely. So there are financial benefits to it, completely. Yeah.
Creatively, I think that's where we're going to have some issues, and then that leads on to the free speech argument and everything else, and it all landslides from there. So I have trouble at the minute trying to find whether there is an adequate balance, because every implementation I've seen of it that's affected me as a person has just made me want to get rid of it.
Every time I'm on Pinterest and I want to find haircut examples or something, drawing references, I look hard enough and I'm like, oh, for God's sake, this is generative AI and it's all unrealistic, and I've only just noticed after looking at it for three seconds. There's not even an effective way to filter that stuff out at the minute, because we can't identify these images.
So for me, it's making my online experience definitely worse. And I know that's the same for a lot of people that just like to enjoy the internet most of the time.
I think it would be an easier process and one that people were less likely to tamper with. The Content Authenticity Initiative already does this with the exact same system it's using to try and identify generative AI images, right? And if we're talking about press images, that is really useful.
So there are certain Sony cameras, and I believe a Leica one as well, that will automatically record that at the moment you take a picture. And you could then upload it to something like Getty or... There are some big media agencies as well, like the New York Times, which I believe has this.
They will automatically register those flags and be like, cool, we've noticed that you haven't tampered with this. We're going to publish this as a documentary image to prove that something has happened and that it's untampered with. The difficulty then is that there aren't really online platforms doing that.
Everyone's focused on trying to identify which images are fake and not which ones are real. And then when you get to the online platform situation, you get the general public involved. And the general public is where people are going to start messing with things because... They have different goals. They might have some strange biases and want to manipulate or mislead people intentionally.
So there's, I think, an argument to be made here that the preservation of the relationship between photography and journalism is going to be really important going forward because that is a much smaller technological bridge to keep moderated.
If you have an established relationship between the people documenting these things with photos and the media agencies they're sending them to for reporting as news, that is a much smaller space to be policing whether something is real. But yeah, as soon as you get any kind of open source thing involved, it's just going to turn into a mess.
If we're taking it as a technical argument, it's not incorrect. Photoshop has been able to make edits like this for a very, very long time, but it completely ignores the main issue of all of this, which is scale.
I think you could probably point to that being the single aggravator that makes all this worse, which is: if you wanted to do this kind of thing in Photoshop, or any editing software really, there were so many skill and financial barriers stopping the general populace from doing so, which usually meant that those edits had to be done with intent. That could have been good or bad intent.
It just meant that there was a little bit more of a thought process behind it. You had to invest in Photoshop or find a free version of it, learn how to use all of the complex tools. And it would maybe take you, I don't know, like 20 minutes, maybe an hour sometimes to make a very photorealistic manipulation that you could use anywhere.
Or for nefarious purposes, if you were that way inclined. AI kind of scraps all of that. It's now landed on phones and web apps, and you can just open a window, tell it what you want to see, and it'll put it there for you. It completely changes the entire landscape of what we've been dealing with.
The accessibility is the main thing. And especially with the stuff that I think people are concerned about, right? Like the kind of slop that you're seeing on X, or Twitter, or whatever we're calling it these days. You're not going to go onto Reddit and say, I want to make a very memeable picture of a politician doing something grotesque. You're going to get pushback.
You might get banned from that platform. You're going to be barred from contacting those people in general. But now you're given the means to just enact that without any barriers, which is effectively what this tech is doing. Half of the advertising for this is that you can recreate anything you can imagine.
I think that was pretty much Google's entire ad campaign for this, to the point where they've called their latest tool Reimagine. Like, it's meant to be a creativity thing. But people don't always have good intentions with their imagination. And yeah, describing that to an actual human being is going to be uncomfortable, or potentially illegal in some cases. Those barriers are just completely removed.
AI, again, has no qualms about whether you should be asking it to make something. It's just going to do it.
I think so. I think the barrier for who was being convinced by this stuff before was there was still a good chunk of people that would see something within a split second and just automatically assume that, yeah, it's accurate without actually looking at the fact that they've got, I don't know, eight fingers on each hand or something, even in the early days of image generation.
But the improvement to the general technology has definitely exacerbated it. And I think there's also an element of people just believing something with a narrative if it works for them anyway. They don't actually need that much substantive evidence for it.
If it's somewhat aligned with a position that they already have and they can use it as an illustrative guide, they're just going to run with it. So I do think it's making things considerably worse, now that it's convincing the people that used to actually try and keep an eye out for obvious fakes. But yeah, it's definitely exacerbating the issue considerably.
No, not at all. I think the Photoshop argument itself almost feels outdated already because it's become synonymous with the act of image editing in general. But the actual software itself was a barrier that hasn't really existed for a little while now since we've started having filtering apps and Facetune and stuff on our mobile phones. What it did was create a real societal problem.
As soon as we had Photoshop introduced, there became this idea that we need to be chasing perfection in everything that we do. And that was pushed in marketing images and that was like negatively impacting body image and stuff. But it was also this idea that a picture should be perfect.
And you should maybe feel bad if it's not, or that you're lesser if you're not able to take stunning images like that. And ever since, we've had these filtering applications, which were incredibly limited: you could make yourself have smoother skin, maybe a slimmer jawline. They weren't massive changes.
I think it's become this kind of thing that if we're given a tool now that can effectively run with the limits of our imagination, people are going to do so.
I think if we're talking about this specific conversation, there's definitely a line. If we're saying Photoshop itself, that was a revolutionary technology, right? That was what we're saying kicked all of this off on a digital aspect. Photo manipulation existed for way longer than that, but that was incredibly difficult. That wasn't just technical know-how, that was physical skills.
You had to know how to cut tiny little film rolls and be able to manipulate them in magnified kind of situations, whereas Photoshop enabled you to do that without having to have all the expensive tools. It was still expensive software, but it's gradually become more accessible. I completely agree with you.
There's a difference between almost manipulating something that had substance to begin with and just creating something that is a complete false reality. It never existed to begin with. And there's something to be said about intent there, which is why I think accessibility is one of the more significant concerns. Because even if you wanted to Photoshop an image or something, I don't know, like,
a lion in a playground or something, again, nefarious, you would usually do so for a giggle. Are you really going to put in that much effort to learn all the necessary skills, pick up the tech, again, free or paid, which is expensive if you want to go down that route, and go through all that effort for a joke, for something that you think is going to be funny? Whereas now, there isn't any effort required. So the idea that you're changing an image, the argument that these two things are similar, isn't a bad one, but it does completely ignore the fact that the accessibility and the scale of these things is the issue at hand, not what's actually happening.
It's such a washy answer to give, but they really exist in context, right? Girls and boys, men and women alike, with all the body image issues, wanting to have themselves perceived on social media, to their enclosed audience, as being in some way the ideal version of themselves.
And I don't think that they're bad for that, given the exposure to all these idealistic images over the last decade or so, ever since the advent of social media, especially Instagram, which is really, really bad for this. But it is effectively the same argument as what we're discussing right now: how much of this are we willing to accept before it becomes problematic?
Because the more this landslides into a situation of questioning how much of this is actually reality, the more we can't trust the images in front of us. So I think it's aligned. It was almost symptomatic, in a way.
I think it's definitely happened. Like it's firmly happened already.
If any celebrity uploads a photo, you can go through the comments, something like, I don't know, the Kardashians, you can go through all of their stuff and you'll have people microanalyzing the background of their images, trying to find any kind of distortion to see whether they've made their waist slimmer or their bum bigger or whatever.
So people are already very, very heavily scrutinising the images that they're seeing in front of them. But I don't think they yet understand, on a big scale, the level of changes that can be applied, because at this point we've come to accept that body imaging, or body editing in these contexts, is just something that people do because they feel bad about themselves.
They might make their teeth whiter. They might make their face smaller. All of these kind of things. I don't think people are going to rush to the...
realization, if they're looking at a picture, that the entire background might not be real. Or again, if someone's sharing a viral image of something that's meant to be, I'm trying to think of something that's not going to be too controversial, but like an explosion in a bin or something that's going to stir up local news, they're not going to look at that immediately and go, that's fake. Because why would you? There has to be a narrative behind that. They understand the body side of it, so there is an adaptation of it, definitely.
I think so. Yeah. One of the most common phrases I keep seeing throughout this argument a lot is the cat is out of the bag. It's already out. You can't do anything about it. And it's very defeatist. It's almost like people half understand the scale of the issue and they can't see a future where there's going to be anything that happens about it.
So they've just discounted any kind of reality in front of them from this point on; any picture that they see online from now on is fake. Some people are already at that point, definitely. But I don't think a lot of people are there.
Not in terms of, especially, older relatives, when we talk about people on Facebook, like nans or aunts, sharing obviously fake images as if they were real. There's the whole crab Jesus meme of early Midjourney edits and things like that. But there aren't enough people, I think, scrutinising the right things at this point.
They've only come to the understanding that we can change our physical appearances because they understand why they would do so. They can't really understand why people would use these tools for nefarious purposes, despite the fact that when you think about it for a second... It's quite easy and you wouldn't even have to be that evil about it.
I saw a picture that our colleague Chris Welch had made using the Reimagine tool on the Pixel 9. And I believe it was a roach added to a takeaway or something, a takeout, sorry, Britishism. And immediately I was like, that's probably going to cause some problems for small businesses, right? I could just order some food, log a complaint and say, hey, look, you've added something to my food.
I want my money back. There are so many smaller-level scams and bad intentions that could be fulfilled using these tools, things people wouldn't have had the energy or the effort, or even the skill if we're being realistic, to carry out before, because it's not worth it. But if I can do that in five seconds, it's quite tempting for some people, I imagine. There's a slippery slope.
The way that I've had it explained is that the system itself is absolutely fine. The way that it's supposed to be rolling out is fine. It makes complete sense. The problem is that you need everyone on board for it to work, which is just completely unrealistic. You would need to get people that have completely different ideologies in terms of
pro- or anti-generative AI to get on board with saying, we're going to make this a robust identification system. And unless you have all of the camera makers, all of the editing software makers, and all of the online platforms, not just the social media ones, but literally everywhere you would see an image, on board with this one system,
I don't think it's going to make a meaningful difference. And it doesn't really solve the issue that they do have at the minute, which is how to provide that information that an image is AI generated or just edited using AI in a concise and meaningful way without giving people a wall of text, which no one is going to read, right?
I'm not going to sit there and read a paragraph of what has gone into a picture on my friend's holiday snaps or something.
Meta tried this when they did their Made With AI labels, and it went pretty badly because they provided no context. Photographers who'd used a couple of what sounded like pretty basic tools in Adobe Photoshop that use a very standard version of generative AI, something like background removal or object select, got caught up in it. The system was effectively flagging that and saying, hey, you've used gen AI, therefore we're going to tag your entire image as being made with this stuff. And it gives the wrong impression, because people see that word, AI, and they immediately think, OK, fake, this entire thing is fake. So there's no nuance in it.
That's a much more complicated problem to solve alongside the existing issue of already trying to get thousands of organisations and companies on board with this one system adopting it. So it's not going to be a bulletproof solution. They know that as well.
The Content Authenticity Initiative were fully transparent about the fact that this might help, but it's not going to solve the problem. So we're kind of in a bit of a mess at the minute; there's a bit of a bind. There should have been a lot of things put into place before AI got to this point, and now that we're playing catch-up, we're too slow.
And there's not a meaningful way to speed up this process at this moment.
It's stored as far as I'm aware on their own independent database where they're trying to set up a process where you can independently check images. That's one avenue. The other avenue was supposed to be that you would access that information through the online platforms where you've already viewed the image that may or may not be fake, right? In terms of
actually manipulating that information, it's already been proven that you can do so. There are safeguards in place; I've heard that apparently, if you screenshot it using certain desktop software, that metadata can still carry over. There are systems that can recognise that what you're screenshotting carries the metadata, and they will carry some of it over.
But then if I were to take, I don't know, my phone out of my pocket and take a picture of something on my desktop computer, yes, the quality is going to look absolutely awful by that point. But none of that metadata is going to be present. And I still have a copy of that manipulation. So at that point, all of the data is just stripped out and there's nothing you can do.
Physical watermarks as well, they've explored that, and we can remove those. Samsung's own tool removes them, if I remember correctly. It watermarks the images that it makes, the text-to-image tool that they've put on Samsung devices, and you can just use the object erase tool and remove the watermark directly in the app that made it. So at the minute there isn't anything more robust. There's a constant kind of avenue that we go down where people are finding ways around it.
You've got two main issues with that at the moment. One is speed. So even though Europe is pretty far ahead of this at the minute, they've already enacted laws which have been heavily scrutinized because of the second issue, which I'll get to in a moment.
But if we're talking about things like the US legal procedures, we could be waiting years for something to actually come into play that will take effect and rein in the bad actors that are using these apps for bad purposes. Because they're not inherently bad tools. You could be using them for something whimsical and absolutely innocent, but you have to separate the bad causes from the good causes.
And when you talk about things like deepfakes, you get onto the second issue, which is nuance. What do you consider a deepfake at this point? If I take a picture of myself on a Google Pixel and I use their face blurring tool, or I put my face onto my friend's body and I put that on Facebook, would that be counted as a deepfake if someone wanted to take me to court, or for any kind of legal fallout at that point? It becomes such a granular argument that it's really difficult to put on paper what should be restricted, because at that point you're effectively placing limitations on creativity, which is difficult to do.
And you've also got to worry about things like free speech around that, and all the existing laws that you have to unwind. So it's not an easy or fast process. And a couple of years in our time frame is a millennium in the development time that we've seen happening in the AI landscape. So we have no idea what could be happening with generative AI in two years at this point.
Again, two years ago, we were seeing pretty shoddy mashups being thrown together by these things. They weren't believable. They were interesting, they were pretty fun to play around with, but they weren't necessarily convincing.
And now we're at the point where you could spit something out on Grok on a social media platform itself and yeah, mislead thousands of people, millions of people if you wanted to. That's a completely separate thing that's happened.
Maybe, but that would dishearten me considerably, as there are significantly worse things happening than whatever Elon Musk could be framed to be doing, right? But if we think about how a lot of these applications are already being used, there are a lot of celebrities, particularly female celebrities, that are being deepfaked, particularly in sexually graphic ways. And that was a big deal.
I can't remember how long ago this is, my memory eludes me, but there was that big hacking incident where people actually broke into celebrities' hard drives and stole nude selfies that they'd taken of themselves. And in that circumstance, there was a lot of vitriol spewed in their direction at these celebrities for taking those pictures in the first place.
Now that can happen completely non-consensually, without them having to do so. People can just make them look nude and put them into very compromising positions online. So it would be pretty grim, I think, of all these people to be ignoring the fact that this is already happening. I'm pretty sure it has already happened to them as well. I haven't looked, because I'm not, you know, a complete degenerate.
But I think there is some argument to be made that a legal process is probably going to step in for this, that it's going to cause enough problems
for enough people on a smaller scale, maybe like the takeout thing that I described, that it's just going to increase the use of small scale scams and fraud, catfishing on platforms that people are going to make enough of a ruckus to go, we're sick of this and we're sick of not trusting everything that we're seeing in front of us anymore.
And you need to do something about it because everyone is just too easily manipulated at this point.
The narcissism thing could absolutely work in this context.
I think the thing that they have to fight against is inherently the scale thing, though. Even if something devastating happens to them, they are in the same boat at that point that everyone else is in, which is that if you wanted to Photoshop one of these images before, if you were a very skilled photo editor, you could maybe do so in 20 or 30 minutes.
And you would have to be very skilled to do so. If not, and you were just some regular Joe wanting to do something, again, malicious or to get money off of these weird dark web porn sites or something, you would be able to do that on a much smaller scale and much more contained way.
So if you went through photo filtering systems to find these images and get them removed from the Internet, that was a much easier process. Now, there aren't necessarily tools from Meta or something that are churning these out. Stuff definitely slips through their guardrails. But there are generative AI systems out there that have no such guardrails.
They've just been built using all the existing technology that's already available and they can spit out
thousands of these images if they want to. If you want to be looking for, again, nude deepfakes of celebrities or, God forbid, really illegal stuff, we're talking children involved and all this kind of stuff, there are violent and gore images and all sorts of things. The sheer scale of it makes them incredibly difficult to remove and to track.
Because everything is online at this point, no one is using their real identities. How do you find these people to prosecute them? And how can you expect a platform to reasonably police that if they've proven in some kind of way in court that they have actually used all of the tools at their current disposal, which are inadequate, and we know they are inadequate, to address the situation?
So it's kind of almost like we're just going around in circles at this point. The problem is getting progressively worse and the tools that we have to address it are not getting better.
I can definitely see the pros. I'm not pooh-poohing the idea of generative AI, saying that I think it should all be burned to the ground and scrapped forever. And I don't think that's even possible anyway, just because someone will rebuild these systems. You're going to get them cropping up, right?
If Photoshop disappeared tomorrow, there's a million other photo editing tools that could replace it if you tried hard enough. The bigger issue is weighing even the good it's doing, and it's definitely helping efficiency, and there are applications for it that are beneficial. I think the Magic Eraser tool that was on, well, that is on Google Pixel phones is still a fantastic thing.
There are things in the background of photos that you don't want to have there. And getting rid of that isn't something you should feel guilty about, or something that's necessarily going to cause problems in the vast majority of situations. So there are definitely use cases where these tools are useful, especially for big industries and stuff.
I personally don't like the argument where it's like, it boosts efficiency for creatives. Because a lot of creators will look at these things and go, but it's taking the creativity out of what I do. It's doing it for me. But if, at the end of the day, all you want is a marketing picture to slap on an ad, and then maybe to change a couple of backgrounds or something, yeah, this is helpful.
It can do that a lot quicker than a human can. Absolutely. So there's financial benefits to it completely. Yeah.
Creatively, I think that's where we're going to have some issues, and then that leads on to the free speech argument and everything else, and it all landslides from there. So I have trouble at the minute trying to find whether there is an adequate balance, because every application I've seen for it that's affected me as a person has just made me want to get rid of it.
Every time I'm on Pinterest and I want to find pictures of haircut examples or something, drawing references, I look hard enough and I'm like, oh, for God's sake, this is generative AI and it's all unrealistic. And I've only just noticed after looking at it for three seconds. There's not even an effective way to filter that stuff out at the minute, because we can't identify these images.
So for me, it's making my online experience definitely worse. And I know that's the same for a lot of people that do just like to enjoy the Internet at most times.
I think it would be an easier process and one that people were less likely to tamper with. The Content Authenticity Initiative already does this with the exact same system it's using to try and identify generative AI images, right? And if we're talking about press images, that is really useful.
So there are certain Sony cameras, and I believe a Leica one as well, that will automatically record that at the moment you take a picture. And you can then upload it to something like Getty, or there are some big media agencies as well; I believe the New York Times has this.
They will automatically register those flags and be like, cool, we've noticed that you haven't tampered with this. We're going to publish this as a documentary image to prove that something has happened and that it has been untampered with. The difficulty then comes in that there aren't really online platforms doing that.
Everyone's focused on trying to identify which images are fake and not which ones are real. And then when you get to the online platform situation, you get the general public involved. And the general public is where people are going to start messing with things, because they have different goals. They might have some strange biases and want to manipulate or mislead people intentionally.
So there's, I think, an argument to be made here that the preservation of the relationship between photography and journalism is going to be really important going forward because that is a much smaller technological bridge to keep moderated.
If you have an established relationship between people that are documenting these things with photos and then sending them to media agencies to be reported as news, that is a much smaller space to be policing whether something is real. But yeah, as soon as you get any kind of open source thing involved, it's just going to turn into a mess.
If we're taking it as a technical argument, it's not incorrect. Photoshop has been able to make edits like this for a very, very long time, but it completely ignores the main issue of all of this, which is scale.
I think you could probably point to that just being the single aggregator that makes all this worse, which is if you wanted to do this kind of thing in Photoshop or any editing software, really, there were so many skill and financial barriers stopping the general populace from doing so, which usually meant that those edits had to be done with intent. That could have been good or bad intent.
It just meant that there was a little bit more of a thought process behind it. You had to invest in Photoshop or find a free version of it, learn how to use all of the complex tools. And it would maybe take you, I don't know, like 20 minutes, maybe an hour sometimes to make a very photorealistic manipulation that you could use anywhere.
For nefarious purposes, if you were that way inclined, AI kind of scraps all of that. It's now landed on phones and web apps, and you can just open a window, tell it what you want to see, and it'll put it there for you. It completely changes the entire landscape of what we've been dealing with.
The accessibility is the main thing. And there's going to be a lot of, especially the stuff that I think people are concerned about, right? Like the kind of slop that you're seeing on X or Twitter or whatever we're calling it these days. You're not going to go onto Reddit and go, I want to make a very memeable picture of a politician doing something grotesque. You're going to get pushback.
You might get banned from that platform. You're going to be barred from contacting those people in general. But now you're given the means to just enact that without any barriers, which is effectively what this tech is doing. Half of the advertising for this is that you can recreate anything that you can imagine.
I think that was pretty much Google's entire ad campaign for this, to the point where they've called their latest tool Reimagine. Like, it's meant to be a creativity thing. But people don't always have good intentions with their imaginations. And yeah, describing that to an actual human being is going to be uncomfortable, or potentially illegal in some cases. Those barriers are just completely removed.
AI, again, has no qualms about whether you should be asking it to make something. It's just going to do it.
I think so. I think with who was being convinced by this stuff before, there was still a good chunk of people that would see something and within a split second just automatically assume that, yeah, it's accurate, without actually noticing that they've got, I don't know, eight fingers on each hand or something, even in the early days of image generation.
But the improvement to the general technology has definitely exacerbated it. And I think there is also an element of people just believing something with a narrative if it works for them anyway. Yeah, they don't actually need that much substantive evidence for it.
If it's somewhat aligned with a position that they already have and they can use it as an illustrative guide, they're just going to run with it. So I do think it's making things considerably worse now that it's convincing the people that used to actually try and keep an eye out for obvious fakes. But yeah, it's definitely exacerbating the issue considerably.
No, not at all. I think the Photoshop argument itself almost feels outdated already because it's become synonymous with the act of image editing in general. But the actual software itself was a barrier that hasn't really existed for a little while now since we've started having filtering apps and Facetune and stuff on our mobile phones. What it did was create a real societal problem.
As soon as we had Photoshop introduced, there became this idea that we need to be chasing perfection in everything that we do. And that was pushed in marketing images and that was like negatively impacting body image and stuff. But it was also this idea that a picture should be perfect.
And you should maybe feel bad if it's not, or that you're less if you're not able to take stunning images like that. And ever since, we've had these filtering applications, which were incredibly limited: you could make yourself have smoother skin, maybe a slimmer jawline. They weren't massive changes.
I think it's become this kind of thing that if we're given a tool now that can effectively run with the limits of our imagination, people are going to do so.
I think if we're talking about this specific conversation, there's definitely a line. If we're saying Photoshop itself, that was a revolutionary technology, right? That was what we're saying kicked all of this off on a digital aspect. Photo manipulation existed for way longer than that, but that was incredibly difficult. That wasn't just technical know-how, that was physical skills.
You had to know how to cut tiny little film rolls and be able to manipulate them in magnified kind of situations, whereas Photoshop enabled you to do that without having to have all the expensive tools. It was still expensive software, but it's gradually become more accessible. I completely agree with you.
There's a difference between almost manipulating something that had substance to begin with and just creating something that is a complete false reality. It never existed to begin with. And there's something to be said about intent there, which is why I think accessibility is one of the more significant concerns. Because even if you wanted to Photoshop an image or something, I don't know, like,