Can open source save AI?

Excuse the clickbait headline, but isn’t everything we write these days done to push some algorithm, somewhere? It just so happens that I recently attended a very interesting event; and it was, rather topically, about open source and AI. But am I writing about it just because it was interesting and I wanted to share some thoughts? Or is it all about the SEO, plus a few behavioral psychology tricks applied to ensure measurable clicks, which boost rankings on social sites and look good on aggregated internal dashboards? It’s as if our robot overlords have already won, and all we have left to do is welcome them in.

But I digress. Getting back to our sheep (as they say in French; I’ll come back to the question), there was a lot to learn from the release of the latest OpenUK research into the economic impact of open source software (OSS) on UK industry and, more generally, its GVA (Gross Value Added). OpenUK is a relatively recent national industry body, formed expressly to “move open technologies, not just OSS but also open data, open standards and open innovation, onto the UK radar,” according to its CEO and keynote speaker, Amanda Brock.

OpenUK’s public purpose is to develop UK leadership and global collaboration in open technology, which essentially means stimulating symbiosis between UK organizations and open technology. More power to OpenUK’s elbow, I say – I recommend interested parties take a look at the research (led by Research Director Dr Jennifer Barth) and act on its findings. In a nutshell, OSS brings over £13bn of value to the UK, representing 27% of UK Tech’s contribution, against planned investment of £327m. By my calculations, that’s roughly a 41x planned ROI.
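As a sanity check on that back-of-the-envelope figure, here is a minimal sketch using the rounded numbers quoted above (these are assumptions drawn from the article, not the report’s exact figures, which is presumably why the unrounded inputs land nearer 41x):

```python
# Rounded figures as quoted in this article (assumptions, not the report's exact numbers)
oss_gva = 13.0e9            # "over £13bn" of value from OSS to the UK
planned_investment = 327e6  # £327m of planned investment

ratio = oss_gva / planned_investment
print(f"Implied planned ROI: roughly {ratio:.0f}x")
```

With the rounded £13bn input this comes out at roughly 40x; a GVA figure a little above £13bn would give the 41x quoted.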

I know it’s not as simple as that, as the spend goes to a global pool of developers, innovators, vendors and others. However, and Amanda made this point, many solutions built on top of OSS end up being based in the US, including UK-founded companies like Weaveworks (GitOps) and Snyk (developer security). UK investors are traditionally more cautious than those in the Bay Area and need a clearer understanding of what OSS can deliver. Conversely, OSS creates more opportunities for skills development and new business creation, furthering the goals of our multi-island nation on the global stage.

The Jeff Goldblum-sized fly in the ointment is AI, which seemingly came out of nowhere to become this year’s hot topic. That’s not entirely true: we’ve been hearing about AI for years, but it seemed to be going the same way as 3D TVs before Midjourney and ChatGPT came along. It is not without irony that this lands smack in the middle of the OpenUK research cycle (which had to generate a second report midway) and UK AI legislation (which had to be rewritten mid-flight to take large-scale models into account).

AI is an important area for the world of open technology, first in terms of software (the most widely used AI platform, TensorFlow, is open source), but also in terms of data. Wikipedia was founded on open principles, both using open source software and publishing its open data on an open content platform, so it was no coincidence that its founder, Jimmy Wales, was present. Recent developments in generative AI relate directly to the availability of open data sources: “50% of the input for ChatGPT is Wikipedia,” says Jimmy, who is comfortable with that. It is, after all, what the data is there for.

So, to the question: can openness save AI? The answer is no, not on its own, but it can go some way towards providing the tools we need to get there, in a way that benefits society in general (and therefore the UK in particular), by putting technology in the hands of the many. One reason is that, as with OSS, the AI genie is already out of the bottle. “We can’t assume there are six companies we can regulate,” Jimmy says, pointing to the millions of hobbyist developers who are already playing with Midjourney via Discord, or writing their own versions of generative AI software. AI can learn from the world of OSS about the power of individual responsibility: we can’t blame the tools, but we can legislate against what people create with them, he suggests. “You could always use Photoshop to create an image; it just wouldn’t look very real. Now it will look more real.”

That is not to say we should dispense with general legislation at the corporate and national level, but it should be directed at the consequences of AI rather than its inevitable and more general use. “The only thing that is inevitable is that governments will regulate; if that is too vertical, it will be too difficult. But the opposite approach, individual responsibility with the right level of governance, bottom-up and principled, that’s the best approach,” says Amanda. As Chris Yiu, Director of Public Policy at Meta, pointed out, this goes hand in hand with the transparency and openness that are (the clue is in the name) the pillars of OSS. If the AI genie has spawned many little genies, we can use them as a peer network to create a stronger result.

I can agree, as long as accountability and openness are applied at all stages of the delivery cycle – there’s a lot to unpack about “the right level of governance” in data collection and management, cybersecurity and access management, and procedural and jurisdictional best practice (what is legal in one country may not be in another, and may be unethical in both). For example, if I used data from Strava’s open API to build a picture of people who might have medical problems, and then published it, who would be responsible? What if I just wrote the code and left it lying around?

It strikes me that the post-Brexit UK is in a unique position to set a different agenda from the EU, which seeks top-down regulation, or the US, which has a habit of playing a little faster and looser with privacy than we would like. At that point, organizations like OpenUK could find they have their work cut out: it’s one thing to advocate for greater acceptance of OSS, quite another to find yourself among the most important voices in a newly created but critical space. That’s a good problem to have, but not one to be taken lightly.

We have time to get this right. No one in the room felt that AI was a runaway train: while examples of AI-driven challenges do exist, they are still the exception rather than the norm (Chris Yiu said, “We’re a long way from anything that comes close to superintelligence”). Nonetheless, we already need independent organizations across this material, advising policy makers on the best way forward. Perhaps open source models, and the open method for creating new ones, can counter the worst potential excesses of AI; and right now, we need all the help we can get as we develop a new understanding of the impact of the information age, both in the UK and beyond.

At which point, we can keep our robots where they belong – a sigh of relief for even the most fearful of an AI-embracing future.
