eupolicy.social is one of the many independent Mastodon servers you can use to participate in the fediverse.
This Mastodon server is a friendly and respectful discussion space for people working in areas related to EU policy. When you request to create an account, please tell us something about yourself.

Server stats:

196 active users

#compute

1 post · 1 participant · 0 posts today
Asakiyume<p>Cadwin approached his big sister Dria, who was tapping a single key on her keyboard over and over. He <a href="https://wandering.shop/tags/craned" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>craned</span></a> his neck, trying to see over her shoulder what it was.</p><p>"I want this galumphing computer to divide by zero," she said without turning round. "I'm not buying this 'does not <a href="https://wandering.shop/tags/compute" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>compute</span></a>' bullshit."</p><p>Just then a subsonic hum came from the computer. Fractal flowers filled, then shattered, the monitor. The flowers spread through the room, engulfing the entranced siblings.</p><p><a href="https://wandering.shop/tags/wss366" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>wss366</span></a> <a href="https://wandering.shop/tags/microfiction" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>microfiction</span></a></p>
Eric Maugendre<p>Alexandre Roure of the Computer &amp; Communications Industry Association, whose members include many Big Tech groups, says the debate about blunt market access restrictions for non-EU tech companies “only distracts policymakers from the real task: finally delivering a functioning digital single market with clear, simple and practical rules”.</p><p>In private conversations, several Big Tech lobbyists and executives also express confidence in their ability to continue dominating the European market given the paucity of homegrown alternatives and the lack of urgency among many consumers.</p><p>by Barbara Moens for FT: <a href="https://archive.is/20250725082920/https://www.ft.com/content/5e25c397-61d1-4b48-b5c5-65561a4c9df2" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">archive.is/20250725082920/http</span><span class="invisible">s://www.ft.com/content/5e25c397-61d1-4b48-b5c5-65561a4c9df2</span></a> via <span class="h-card" translate="no"><a href="https://fediscience.org/@Ruth_Mottram" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>Ruth_Mottram</span></a></span> 🧵</p><p><a href="https://hachyderm.io/tags/techSovereignty" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>techSovereignty</span></a> <a href="https://hachyderm.io/tags/digitalSovereignty" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>digitalSovereignty</span></a> <a href="https://hachyderm.io/tags/decoupling" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>decoupling</span></a> <a href="https://hachyderm.io/tags/EuropeAlternatives" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>EuropeAlternatives</span></a> <a href="https://hachyderm.io/tags/nonUS" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>nonUS</span></a> <a href="https://hachyderm.io/tags/EuropeanAlternatives" class="mention 
hashtag" rel="nofollow noopener" target="_blank">#<span>EuropeanAlternatives</span></a> <a href="https://hachyderm.io/tags/OVH" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>OVH</span></a> <a href="https://hachyderm.io/tags/Europe" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Europe</span></a> <a href="https://hachyderm.io/tags/EU" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>EU</span></a> <a href="https://hachyderm.io/tags/privacy" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>privacy</span></a> <a href="https://hachyderm.io/tags/dataProtection" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>dataProtection</span></a> <a href="https://hachyderm.io/tags/OVHCloud" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>OVHCloud</span></a> <a href="https://hachyderm.io/tags/CloudComputing" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CloudComputing</span></a> <a href="https://hachyderm.io/tags/BuyEuropean" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>BuyEuropean</span></a> <a href="https://hachyderm.io/tags/EuroStack" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>EuroStack</span></a> <a href="https://hachyderm.io/tags/compute" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>compute</span></a> <a href="https://hachyderm.io/tags/bigTech" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>bigTech</span></a> <a href="https://hachyderm.io/tags/GAFAM" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>GAFAM</span></a> <a href="https://hachyderm.io/tags/Cloud" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Cloud</span></a> <a href="https://hachyderm.io/tags/FT" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>FT</span></a></p>
openSUSE Linux<p>Looking to bring the <a href="https://fosstodon.org/tags/cloud" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>cloud</span></a> back to your server room? At <a href="https://fosstodon.org/tags/oSC25" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>oSC25</span></a>, we took a <a href="https://fosstodon.org/tags/deep" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>deep</span></a> look under the hood of <a href="https://fosstodon.org/tags/Harvester" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Harvester</span></a>, an <a href="https://fosstodon.org/tags/opensource" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>opensource</span></a> hyperconverged <a href="https://fosstodon.org/tags/infrastructure" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>infrastructure</span></a> platform leveraging <a href="https://fosstodon.org/tags/Longhorn" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Longhorn</span></a> <a href="https://fosstodon.org/tags/storage" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>storage</span></a> to simplify <a href="https://fosstodon.org/tags/compute" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>compute</span></a>, networking &amp; <a href="https://fosstodon.org/tags/storage" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>storage</span></a>. <a href="https://youtu.be/OOLTpwQWspI?si=2InT3vvAccWHwN6B" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">youtu.be/OOLTpwQWspI?si=2InT3v</span><span class="invisible">vAccWHwN6B</span></a></p>
openSUSE Linux<p><a href="https://fosstodon.org/tags/Linux" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Linux</span></a> on the desktop isn’t just a dream anymore. In <a href="https://fosstodon.org/tags/TheGreatMigration" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>TheGreatMigration</span></a> (Part I &amp; II), explore <a href="https://fosstodon.org/tags/Endof10" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Endof10</span></a> &amp; why now might be the right time for <a href="https://fosstodon.org/tags/software" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>software</span></a> <a href="https://fosstodon.org/tags/compute" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>compute</span></a> changes. <a href="https://fosstodon.org/tags/oSC25" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>oSC25</span></a> <a href="https://fosstodon.org/tags/Linux" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Linux</span></a> <a href="https://fosstodon.org/tags/openSUSE" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>openSUSE</span></a> <a href="https://youtu.be/2HHle_OmqRE?si=TXfylAA2TBU4sS22" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">youtu.be/2HHle_OmqRE?si=TXfylA</span><span class="invisible">A2TBU4sS22</span></a></p>
JesseBot<p>:boost_ok: Can anyone point me to a webhosting company in South Africa that is not big tech?</p><p>No AWS, Azure, GCP, for instance. I'd prefer to avoid anything involved in apartheid.</p><p><a href="https://social.smallhack.org/tags/southafrica" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>southafrica</span></a> <a href="https://social.smallhack.org/tags/hosting" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>hosting</span></a> <a href="https://social.smallhack.org/tags/webhost" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>webhost</span></a> <a href="https://social.smallhack.org/tags/webhosting" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>webhosting</span></a> <a href="https://social.smallhack.org/tags/vps" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>vps</span></a> <a href="https://social.smallhack.org/tags/compute" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>compute</span></a></p>
Toni Aittoniemi<p>There are things that grow exponentially. And yes, humanity is predictably bad at predicting things on an exponential growth trajectory.</p><p>Silicon Valley has made such a myth of this phenomenon that they’re completely convinced this will play out with <a href="https://mastodon.green/tags/AI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AI</span></a> as well.<br>Perhaps because it makes them feel smart?</p><p>But there’s no signal that the current models’ performance would keep on scaling exponentially with added <a href="https://mastodon.green/tags/compute" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>compute</span></a>. On the contrary, it seems to be showing diminishing returns already.</p>
Cybersecurity & cyberwarfare<p><b>Artificial Intelligence: Implementing the Attention Mechanism in Python</b></p><p>The attention mechanism is often associated with the <strong>transformer</strong> architecture, but it had already been used in <strong>RNNs (recurrent networks)</strong>.</p><p>In machine-translation tasks (for example, English-Italian), to predict the next Italian word <em>the model has to focus, or pay attention, on the most important English words in the input, the ones useful for producing a good translation.</em></p><p>I will not go into the details of RNNs, but <strong>attention helped those models mitigate the vanishing-gradient problem</strong> and capture more long-range dependencies between words.</p><p>At a certain point we realized that the only important part was the attention mechanism, and the whole RNN architecture was unnecessary. Hence, <a href="https://arxiv.org/abs/1706.03762" rel="nofollow noopener" target="_blank">Attention is All You Need!</a><br> </p><p><strong>Self-Attention in Transformers</strong></p><p><br>Classic attention indicates which words of the input sequence the words of the output sequence should attend to. It matters in sequence-to-sequence tasks such as machine translation.</p><p>Self-attention is a specific kind of attention. It operates between any two elements of the same sequence, telling us how “related” the words of the same sentence are.</p><p>For a given token (or word) in a sequence, self-attention generates a list of attention weights corresponding to all the other tokens of the sequence. This process is applied to every token of the sentence, yielding a matrix of attention weights (as in the figure).</p><p>That is the general idea; in practice things are a bit more complicated, because we want to add many parameters/weights to our network so that the model has more learning capacity.<br> </p><p><strong>The K, V, Q representations</strong></p><p><br>The input to our model is a sentence such as “my name is <a href="https://www.linkedin.com/in/marcello-politi/" rel="nofollow noopener" target="_blank">Marcello Politi</a>”. Through tokenization, a sentence is converted into a list of numbers such as [2, 6, 8, 3, 1].</p><p>Before passing the sentence to the transformer, we need to create a dense representation for each token.</p><p>How do we create this representation? We multiply each token by a matrix, and the matrix is learned during training.</p><p>Now let’s add some complexity.</p><p>For each token we create 3 vectors instead of one, called the key (K), value (V) and query (Q) vectors. (We will see later how to create these 3 vectors.)</p><p>Conceptually, these 3 vectors have a specific meaning:</p><ul><li>The key vector represents the main information captured by the token.</li><li>The value vector captures the full information of the token.</li><li>The query vector is a question about the relevance of the token to the current task.</li></ul><p>The idea is that we focus on a particular token <em>i</em> and want to ask how important the other tokens of the sentence are with respect to the token <em>i</em> under consideration.</p><p>This means we take the vector <em>q_i</em> (we pose a question about <em>i</em>) for token <em>i</em>, and perform some mathematical operations with all the other tokens <em>k_j</em> (<em>j != i</em>). It is as if we were asking, at first glance, which other tokens of the sequence look really important for understanding the meaning of token <em>i</em>.</p><p>But what is this magic operation?</p><p>We multiply (dot-product) the query vector by the key vectors and divide by a normalization factor. This is done for every token <em>k_j</em>.</p><p>In this way we obtain a score for every pair (<em>q_i, k_j</em>). We turn these scores into a probability distribution by applying a softmax operation to them. Good, now we have obtained the attention weights!</p><p>With the attention weights, we know how important each token <em>k_j</em> is for disambiguating token <em>i</em>. So now we multiply the value vector <em>v_j</em> associated with each token by its weight and sum up the vectors. This yields the final <strong>context-aware</strong> vector of <em>token_i</em>.</p><p>If we are computing the contextual dense vector of <em>token_1</em>, we compute:</p><p><em>z1 = a11*v1 + a12*v2 + … + a15*v5</em></p><p>where <em>a1j</em> are the computed attention weights and <em>v_j</em> are the value vectors.</p><p>Done! Almost…</p><p>I have not explained how we obtain the k, v and q vectors of each token. We need to define some matrices w_k, w_v and w_q such that when we multiply:</p><ul><li>token * w_k -&gt; k</li><li>token * w_q -&gt; q</li><li>token * w_v -&gt; v</li></ul><p>These three matrices are initialized randomly and learned during training; this is why modern models such as LLMs have so many parameters.<br> </p><p><strong>Multi-Head Self-Attention (MHSA) in Transformers</strong></p><p><br>Are we sure that the self-attention mechanism above can capture all the important relationships between the tokens (words) and create dense vectors of those tokens that truly make sense?</p><p>In practice it may not always work perfectly. What if, to mitigate the error, we reran the whole operation twice with new w_q, w_k and w_v matrices and somehow merged the two dense vectors obtained? Perhaps one self-attention would manage to capture some relationships and the other would capture some different ones.</p><p>Well, this is exactly what happens in MHSA. The case just discussed has two heads, because it has two sets of w_q, w_k and w_v matrices. We can have even more heads: 4, 8, 16, and so on.</p><p>The only complicated part is that all these heads are handled in parallel, processed in a single computation using tensors.</p><p>The way we merge the dense vectors of each head is simple: we concatenate them (so the dimension of each vector must be smaller, such that concatenating them gives back the original dimension we wanted) and pass the resulting vector through another learnable matrix w_o.<br> </p><p><strong>Hands-on</strong></p><p>Suppose we have a sentence. After tokenization, each token (or word) corresponds to an index (number):</p><p>tokenized_sentence = torch.tensor([<br> 2, #my<br> 6, #name<br> 8, #is<br> 3, #marcello<br> 1 #politi<br>])<br>tokenized_sentence</p><p>Before passing the sentence into the transformer, we need to create a dense representation for each token.</p><p>How do we create this representation? We multiply each token by a matrix. This matrix is learned during training.</p><p>Let’s build this matrix, called the embedding matrix.</p><p>torch.manual_seed(0) # set a fixed seed for reproducibility<br>embed = torch.nn.Embedding(10, 16)</p><p>If we multiply our tokenized sentence by the embedding matrix, we obtain a dense representation of dimension 16 for each token.</p><p>sentence_embed = embed(tokenized_sentence).detach()<br>sentence_embed</p><p>To use the attention mechanism we need to create 3 new matrices, w_q, w_k and w_v. Multiplying an input token by w_q gives the vector q. The same holds for w_k and w_v.</p><p>d = sentence_embed.shape[1] # let's base our matrices on a shape (16,16)</p><p>w_key = torch.rand(d,d)<br>w_query = torch.rand(d,d)<br>w_value = torch.rand(d,d)</p><p><strong>Computing the attention weights</strong></p><p><br>Let’s now compute the attention weights for the first token of the sentence only.</p><p>token1_embed = sentence_embed[0]<br># compute the three vectors associated with the token1 vector: q, k, v<br>key_1 = w_key.matmul(token1_embed)<br>query_1 = w_query.matmul(token1_embed)<br>value_1 = w_value.matmul(token1_embed)</p><p>print("key vector for token1: \n", key_1)<br>print("query vector for token1: \n", query_1)<br>print("value vector for token1: \n", value_1)</p><p>We need to multiply the query vector associated with token1 (query_1) by all the keys of the other vectors.</p><p>So now we need to compute all the keys (key_2, key_3, key_4, key_5). But wait: we can compute them all at once by multiplying sentence_embed by the matrix w_k.</p><p>keys = sentence_embed.matmul(w_key.T)<br>keys[0] # contains the key vector of the first token, and so on</p><p>We do the same with the values.</p><p>values = sentence_embed.matmul(w_value.T)<br>values[0] # contains the value vector of the first token, and so on</p><p>Now let’s compute the first part of the formula.</p><p>import torch.nn.functional as F</p><p># the following are the attention weights of the first token to all the others<br>a1 = F.softmax(query_1.matmul(keys.T)/d**0.5, dim = 0)<br>a1</p><p>With the attention weights we know how important each token is, so we now multiply the value vector associated with each token by its weight, obtaining the final vector of token_1, one that also includes the context.</p><p>z1 = a1.matmul(values)<br>z1</p><p>In the same way, we can compute the context-aware dense vectors of all the other tokens. So far we have always used the same matrices w_k, w_q, w_v; we say we are using a single head.</p><p>But we can have several triplets of matrices, i.e., multiple heads. This is why it is called multi-head attention.</p><p>The dense vectors that each head produces for an input token are then concatenated and linearly transformed to obtain the final dense vector.</p><p>import torch<br>import torch.nn as nn<br>import torch.nn.functional as F</p><p>torch.manual_seed(0) # set a fixed seed for reproducibility</p><p># Tokenized sentence (same as before)<br>tokenized_sentence = torch.tensor([2, 6, 8, 3, 1]) # [my, name, is, marcello, politi]</p><p># Embedding layer: vocab size = 10, embedding dim = 16<br>embed = nn.Embedding(10, 16)<br>sentence_embed = embed(tokenized_sentence).detach() # Shape: [5, 16] (seq_len, embed_dim)</p><p>d = sentence_embed.shape[1] # embed dimension 16<br>h = 4 # Number of heads<br>d_k = d // h # Dimension per head (16 / 4 = 4)</p><p># Define weight matrices for each head<br>w_query = torch.rand(h, d, d_k) # Shape: [4, 16, 4] (one d x d_k matrix per head)<br>w_key = torch.rand(h, d, d_k) # Shape: [4, 16, 4]<br>w_value = torch.rand(h, d, d_k) # Shape: [4, 16, 4]<br>w_output = torch.rand(d, d) # Final linear layer: [16, 16]</p><p># Compute Q, K, V for all tokens and all heads<br># sentence_embed: [5, 16] -&gt; Q: [4, 5, 4] (h, seq_len, d_k)<br>queries = torch.einsum('sd,hde-&gt;hse', sentence_embed, w_query) # h heads, seq_len tokens, d_k dims<br>keys = torch.einsum('sd,hde-&gt;hse', sentence_embed, w_key) # h heads, seq_len tokens, d_k dims<br>values = torch.einsum('sd,hde-&gt;hse', sentence_embed, w_value) # h heads, seq_len tokens, d_k dims</p><p># Compute attention scores<br>scores = torch.einsum('hse,hek-&gt;hsk', queries, keys.transpose(-2, -1)) / (d_k ** 0.5) # [4, 5, 5]<br>attention_weights = F.softmax(scores, dim=-1) # [4, 5, 5]</p><p># Apply attention weights<br>head_outputs = torch.einsum('hij,hjk-&gt;hik', attention_weights, values) # [4, 5, 4]<br>head_outputs.shape</p><p># Concatenate heads<br>concat_heads = head_outputs.permute(1, 0, 2).reshape(sentence_embed.shape[0], -1) # [5, 16]<br>concat_heads.shape</p><p>multihead_output = concat_heads.matmul(w_output) # [5, 16] @ [16, 16] -&gt; [5, 16]<br>print("Multi-head attention output for token1:\n", multihead_output[0])</p><p><strong>Conclusions</strong></p><p><br>In this post I implemented a simple version of the attention mechanism. This is not how it is actually implemented in modern frameworks, but my goal is to provide some insight that lets anyone understand how it works. In upcoming articles I will analyze the full implementation of a transformer architecture.</p><p>The article <a href="https://www.redhotcyber.com/post/implementazione-del-meccanismo-dellattenzione-in-python/" rel="nofollow noopener" target="_blank">Intelligenza Artificiale: Implementazione del meccanismo dell’attenzione in Python</a> originally appeared on <a href="https://www.redhotcyber.com/feed" rel="nofollow noopener" target="_blank">il blog della sicurezza informatica</a>.</p>
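The single-head procedure the article walks through (project each embedding to q/k/v, take scaled dot-product scores, apply softmax, then form a weighted sum of value vectors) also fits in a few lines of dependency-free Python. This is a minimal sketch under my own naming — `self_attention`, `matvec` and the identity-matrix toy input are illustrative choices, not from the article:

```python
import math

def softmax(xs):
    # numerically stable softmax: subtract the max before exponentiating
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def matvec(M, v):
    # multiply matrix M (a list of rows) by vector v
    return [sum(m_ij * v_j for m_ij, v_j in zip(row, v)) for row in M]

def self_attention(X, w_q, w_k, w_v):
    # X: list of token embeddings; w_q/w_k/w_v: d x d projection matrices.
    # Returns one context-aware vector per token (a single head).
    d = len(X[0])
    Q = [matvec(w_q, x) for x in X]
    K = [matvec(w_k, x) for x in X]
    V = [matvec(w_v, x) for x in X]
    out = []
    for q in Q:
        # scaled dot-product of this query with every key -> scores
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in K]
        a = softmax(scores)  # attention weights, they sum to 1
        # weighted sum of the value vectors: z_i = sum_j a_ij * v_j
        z = [sum(a_j * v[i] for a_j, v in zip(a, V)) for i in range(d)]
        out.append(z)
    return out

# toy usage: two one-hot "embeddings" with identity projections
I2 = [[1.0, 0.0], [0.0, 1.0]]
Z = self_attention([[1.0, 0.0], [0.0, 1.0]], I2, I2, I2)
```

With identity projections each token attends mostly to itself (its query-key dot product is largest), so `Z[0][0] > Z[0][1]`; swapping in learned matrices recovers the behaviour of the article's PyTorch version.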
☮ ♥ ♬ 🧑‍💻<p>Day 19 cont ☢️🛢️🏭🏦🏢🏢🏢💰💰</p><p>“He (<a href="https://ioc.exchange/tags/PeterDutton" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>PeterDutton</span></a>) cites <a href="https://ioc.exchange/tags/DataCentres" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>DataCentres</span></a> in the US where those <a href="https://ioc.exchange/tags/tech" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>tech</span></a> companies are having conversations with nuclear power providers:</p><p>The beauty of an <a href="https://ioc.exchange/tags/investment" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>investment</span></a> like <a href="https://ioc.exchange/tags/nuclear" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>nuclear</span></a> into the <a href="https://ioc.exchange/tags/Hunter" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Hunter</span></a> region for example is you can attract the data centres which is exactly what is happening in the US. 
<a href="https://ioc.exchange/tags/Apple" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Apple</span></a> and <a href="https://ioc.exchange/tags/Oracle" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Oracle</span></a> and <a href="https://ioc.exchange/tags/Microsoft" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Microsoft</span></a>, or these <a href="https://ioc.exchange/tags/companies" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>companies</span></a> are willing to spend tens of billions of dollars but they are only having conversations with <a href="https://ioc.exchange/tags/NuclearPower" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>NuclearPower</span></a> providers.”</p><p><a href="https://ioc.exchange/tags/Straya" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Straya</span></a> gov cant <a href="https://ioc.exchange/tags/science" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>science</span></a> or <a href="https://ioc.exchange/tags/compute" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>compute</span></a>, the LNP are garbage at business. Nuclear generation is <a href="https://ioc.exchange/tags/toxic" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>toxic</span></a>. 
<a href="https://ioc.exchange/tags/Multinationals" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Multinationals</span></a> avoid tax.</p><p><a href="https://ioc.exchange/tags/AusPol" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AusPol</span></a> / <a href="https://ioc.exchange/tags/LNP" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>LNP</span></a> / <a href="https://ioc.exchange/tags/Iberal" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Iberal</span></a> / <a href="https://ioc.exchange/tags/Nationals" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Nationals</span></a> / <a href="https://ioc.exchange/tags/Business" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Business</span></a> / <a href="https://ioc.exchange/tags/AI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AI</span></a> / <a href="https://ioc.exchange/tags/ArtificialIntelligence" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ArtificialIntelligence</span></a> &lt;<a href="https://www.theguardian.com/australia-news/live/2025/apr/17/australia-election-2025-live-peter-dutton-anthony-albanese-coalition-labor-income-tax-cost-of-living-leaders-debate-ntwnfb?page=with%3Ablock-68006d1c8f08bcf9ff4832be#block-68006d1c8f08bcf9ff4832be" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://www.</span><span class="ellipsis">theguardian.com/australia-news</span><span class="invisible">/live/2025/apr/17/australia-election-2025-live-peter-dutton-anthony-albanese-coalition-labor-income-tax-cost-of-living-leaders-debate-ntwnfb?page=with%3Ablock-68006d1c8f08bcf9ff4832be#block-68006d1c8f08bcf9ff4832be</span></a>&gt;</p>
☮ ♥ ♬ 🧑‍💻<p>“Elon Musk said on Friday (Saturday AEDT) that his <a href="https://ioc.exchange/tags/xAI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>xAI</span></a> has acquired X, the social media app formerly known as <a href="https://ioc.exchange/tags/Twitter" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Twitter</span></a>, in an all-stock transaction for $US45 billion ($71.5 billion), including debt.</p><p>xAI and X’s futures are intertwined. Today, we officially take the step to combine the <a href="https://ioc.exchange/tags/data" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>data</span></a>, <a href="https://ioc.exchange/tags/models" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>models</span></a>, <a href="https://ioc.exchange/tags/compute" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>compute</span></a>, <a href="https://ioc.exchange/tags/distribution" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>distribution</span></a> and <a href="https://ioc.exchange/tags/talent" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>talent</span></a>,” <a href="https://ioc.exchange/tags/Musk" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Musk</span></a> said in a post on X, adding that the combined company would be valued at $US80 billion.”</p><p><a href="https://ioc.exchange/tags/business" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>business</span></a> / <a href="https://ioc.exchange/tags/acquisitions" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>acquisitions</span></a> / <a href="https://ioc.exchange/tags/CreativeAccounting" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CreativeAccounting</span></a> &lt;<a href="https://archive.md/in6TN" rel="nofollow noopener" translate="no" target="_blank"><span 
class="invisible">https://</span><span class="">archive.md/in6TN</span><span class="invisible"></span></a>&gt; / &lt;<a href="https://www.afr.com/technology/musk-s-xai-buys-social-media-platform-x-for-71-5b-20250329-p5lnh9" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://www.</span><span class="ellipsis">afr.com/technology/musk-s-xai-</span><span class="invisible">buys-social-media-platform-x-for-71-5b-20250329-p5lnh9</span></a>&gt; (paywall)</p>
Toni Aittoniemi<p>Putin is betting on <a href="https://mastodon.green/tags/crypto" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>crypto</span></a>, <a href="https://mastodon.green/tags/compute" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>compute</span></a> and <a href="https://mastodon.green/tags/energy" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>energy</span></a> to dominate the world in 100 years.<br>They’re investing in Chinese GPUs &amp; miners.<br>Siberia has great cooling and endless gas.<br>The crypto libertarians are not your friend. The Russian Empire is theirs though.<br>Putin lays out his plan of using them to finance his next imperial war.<br><a href="https://mastodon.green/tags/nafo" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>nafo</span></a><br><a href="https://youtube.com/watch?v=Bh5O-cRGmJM" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">youtube.com/watch?v=Bh5O-cRGmJ</span><span class="invisible">M</span></a></p>
Kevin Karhan :verified:<p><span class="h-card" translate="no"><a href="https://norden.social/@duco" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>duco</span></a></span> <span class="h-card" translate="no"><a href="https://piratenpartei.social/profile/alexis_roussel" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>alexis_roussel</span></a></span> what makes <a href="https://infosec.space/tags/Bitcoin" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Bitcoin</span></a> the real <a href="https://infosec.space/tags/shitcoin" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>shitcoin</span></a> (along with basically everything that isn't <a href="https://infosec.space/tags/Monero" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Monero</span></a>) is that it's not only <a href="https://infosec.space/tags/PoW" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>PoW</span></a> (<a href="https://infosec.space/tags/ProofOfWork" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ProofOfWork</span></a>) but also basically <a href="https://infosec.space/tags/ASIC" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ASIC</span></a>-based, so that means literal metric tons of <a href="https://infosec.space/tags/eWaste" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>eWaste</span></a> get created every year producing those.</p><ul><li>And since ASICs ain't like <a href="https://infosec.space/tags/CPU" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CPU</span></a>|s, <a href="https://infosec.space/tags/GPU" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>GPU</span></a>|s &amp; <a href="https://infosec.space/tags/FPGA" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>FPGA</span></a>|s, one can't even <a 
href="https://infosec.space/tags/reuse" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>reuse</span></a> &amp; <a href="https://infosec.space/tags/upcycle" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>upcycle</span></a> them for different workloads (like <a href="https://infosec.space/tags/Rendering" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Rendering</span></a> and general-purpose <a href="https://infosec.space/tags/Compute" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Compute</span></a>) as they are absolutely inflexible in their use-case.</li></ul>
Kevin Karhan :verified:<p><span class="h-card" translate="no"><a href="https://mastodon.monoceros.co.za/@uastronomer" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>uastronomer</span></a></span> it's something I did implement in the past (albeit <a href="https://infosec.space/tags/KVM" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>KVM</span></a> + <a href="https://infosec.space/tags/Proxmox" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Proxmox</span></a>, but the steps are similar enough):</p><p>You can separate <a href="https://infosec.space/tags/Storage" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Storage</span></a> and <a href="https://infosec.space/tags/Compute" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Compute</span></a> given you have a Storage-LAN that is fast enough (and does at least 9k if not 64k Jumbo Frames) and have the <em>"Compute Nodes"</em> entirely <a href="https://infosec.space/tags/diskless" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>diskless</span></a> (booting via <a href="https://infosec.space/tags/iPXE" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>iPXE</span></a> from the <a href="https://infosec.space/tags/SAN" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>SAN</span></a>) and then mount the storage via <a href="https://infosec.space/tags/iSCSI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>iSCSI</span></a> or <a href="https://infosec.space/tags/Ceph" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Ceph</span></a>.</p><ul><li>Basically it allows you to scale Compute and Storage independently from each other as they are transparent layers, and not be confined to the limits of a single chassis &amp; its I/O options...</li></ul><p>Did a bigger project (easily 8 digits in hardware, as per MSRP)
where an Employer/Client did <a href="https://infosec.space/tags/CloudExit" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CloudExit</span></a> amidst escalating costs, with <a href="https://infosec.space/tags/ROI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ROI</span></a> being within quarters (if not months at the predicted growth rate)...</p>
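For the diskless boot path described above, the node's firmware would chain-load an iPXE script of roughly this shape. This is a minimal config sketch, not from the post itself: the IP address and IQNs are placeholders, and the exact target string depends on your SAN.

```
#!ipxe
# Bring up the NIC via DHCP on the storage LAN
# (jumbo frames are configured on the NIC/switch side, not here)
dhcp
# Identify this node to the iSCSI target (placeholder IQN)
set initiator-iqn iqn.2025-01.lan.example:compute-node01
# Boot straight from the iSCSI LUN exported by the SAN
# (placeholder portal address and target IQN)
sanboot iscsi:10.0.0.10::::iqn.2025-01.lan.example:target0
```

With the root disk on the SAN, replacing a compute node is just pointing a new chassis at the same boot target.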
Eric Maugendre<p>Software As A Service 'is one of the most dominant business models in tech, because it fits both the customer profile of "not wanting to run a bunch of infrastructure" and the tech industry’s love of trapping people in distinct ecosystems. […] The inevitable sprawl of letting SaaS into your organization means that you’re stuck with them.'</p><p>Why you suffer from your business <a href="https://social.coop/tags/software" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>software</span></a>: <a href="https://www.wheresyoured.at/saaspocalypse-now/" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://www.</span><span class="ellipsis">wheresyoured.at/saaspocalypse-</span><span class="invisible">now/</span></a></p><p><a href="https://social.coop/tags/bubble" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>bubble</span></a> <a href="https://social.coop/tags/saas" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>saas</span></a> <a href="https://social.coop/tags/BigTech" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>BigTech</span></a> <a href="https://social.coop/tags/Cloud" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Cloud</span></a> <a href="https://social.coop/tags/AWS" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AWS</span></a> <a href="https://social.coop/tags/strategy" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>strategy</span></a> <a href="https://social.coop/tags/AI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AI</span></a> <a href="https://social.coop/tags/Microsoft" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Microsoft</span></a> <a href="https://social.coop/tags/AIBubble" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AIBubble</span></a> <a 
href="https://social.coop/tags/chatBots" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>chatBots</span></a> <a href="https://social.coop/tags/compute" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>compute</span></a></p>
Toni Aittoniemi<p><span class="h-card" translate="no"><a href="https://ec.social-network.europa.eu/@REA" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>REA</span></a></span> I would love for the <a href="https://mastodon.green/tags/EU" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>EU</span></a> commission to note that if Europe does not set up its own <a href="https://mastodon.green/tags/cloud" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>cloud</span></a> <a href="https://mastodon.green/tags/infrastructure" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>infrastructure</span></a> very soon, we will end up paying rent to one of the two big players: the US or China.</p><p>You should understand that if they own the computing power, they can at any time change the rules on the kinds of workloads that run on them and how they benefit from it.</p><p>For Europe to remain sovereign in the <a href="https://mastodon.green/tags/compute" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>compute</span></a>-defined future, we need to be sovereign in the cloud.</p>

AI is about compute, data and algorithms. This paper maps the global compute capacity available in the public cloud. Interesting read.
"Compute North vs. Compute South: The Uneven Possibilities of Compute-based AI Governance Around the Globe"
osf.io/preprints/socarxiv/8yp7


I learned that one of the touted advantages of cloud computing is more efficient use of resources: on shared infrastructure you don't need your maximum compute and memory all the time, so capacity can be shared. This blog post looked at how efficiently Kubernetes clusters actually run. The outcomes are terrible: massive overprovisioning of both compute and memory.
nextplatform.com/2024/03/04/ku
#cloudcomputing #kubernetes #compute #memory #efficiency
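The waste described above can be quantified as a simple reserved-vs-used ratio. A minimal sketch; the function is hypothetical and the figures are illustrative, not taken from the article:

```python
# Quantify overprovisioning: how much capacity was reserved (requested)
# versus how much was actually consumed.

def overprovision_ratio(requested: float, used: float) -> float:
    """Return how many times more capacity was reserved than consumed."""
    if used <= 0:
        raise ValueError("used must be positive")
    return requested / used

# Illustrative numbers: pods request 16 CPU cores but average 2 cores
# of real usage, and request 64 GiB of memory while using 24 GiB.
cpu_ratio = overprovision_ratio(requested=16.0, used=2.0)
mem_ratio = overprovision_ratio(requested=64.0, used=24.0)

print(f"CPU reserved/used: {cpu_ratio:.1f}x")     # 8.0x
print(f"Memory reserved/used: {mem_ratio:.1f}x")
```

Anything much above 1.0x across a whole cluster is capacity you pay for but never use.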

The Next Platform · Kubernetes Clusters Have Massive Overprovisioning Of Compute And Memory