I mean, absolutely. But you can't just rail at market forces and cultural values; it's more complex than that. It's not the companies' or the artists' fault that there is a gap between cost, production, and product. Having done gig editing work, it makes sense to me that AI can produce social media and blog posts with more accuracy and speed than I can. I don't begrudge losses in market share on that basis; I begrudge it for the same reason I always do: the general complicity with the values that undergird our culture. Markets get upset by new technologies, wars, etc. That is natural. Artists have already been priced into their cubby in society, and AI accelerates what is already a trend.
Have you interviewed artists who make money from their art? I mean, you'd need to gather a solid dataset for that to be actually interesting.
I certainly understand your position, but I'm not sure what you're responding to here. The point of this piece is to underscore the extent to which the language we use to refer to the corpus (another metaphor, but a less atomizing one) on which generative AI is trained shapes how we think about it and how we do or don't respond to it. And I'm not sure what your final question would reveal that would challenge the point I'm making. Not picking a fight, just wondering how it's connected to what I wrote. :)