#modelmarketplaces

Miguel Afonso Caetano<p><a href="https://tldr.nettime.org/tags/AI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AI</span></a> <a href="https://tldr.nettime.org/tags/ML" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ML</span></a> <a href="https://tldr.nettime.org/tags/ContentModeration" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ContentModeration</span></a> <a href="https://tldr.nettime.org/tags/LLMs" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>LLMs</span></a> <a href="https://tldr.nettime.org/tags/ModelMarketplaces" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ModelMarketplaces</span></a> <a href="https://tldr.nettime.org/tags/PlatformGovernance" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>PlatformGovernance</span></a>: "The AI development community is increasingly making use of hosting intermediaries, such as Hugging Face, which provide easy access to user-uploaded models and training data. These model marketplaces lower technical deployment barriers for hundreds of thousands of users, yet can be used in numerous potentially harmful and illegal ways. In this article, we explain the ways in which AI systems, which can both ‘contain’ content and be open-ended tools, present one of the trickiest platform governance challenges seen to date. We provide case studies of several incidents across three illustrative platforms – Hugging Face, GitHub and Civitai – to examine how model marketplaces moderate models. Building on this analysis, we outline important (and yet nevertheless limited) practices that industry has been developing to respond to moderation demands: licensing, access and use restrictions, automated content moderation, and open policy development. 
While the policy challenge at hand is a considerable one, we conclude with some ideas as to how platforms could better mobilise resources to act as a careful, fair, and proportionate regulatory access point."</p><p><a href="https://www.tandfonline.com/doi/full/10.1080/17579961.2024.2388914" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://www.</span><span class="ellipsis">tandfonline.com/doi/full/10.10</span><span class="invisible">80/17579961.2024.2388914</span></a></p>