Security updates have been issued by Debian (org-mode), Oracle (shim and tigervnc), Red Hat (ansible-core, avahi, buildah, container-tools:4.0, containernetworking-plugins, edk2, exfatprogs, fence-agents, file, freeglut, freerdp, frr, grub2, gstreamer1-plugins-bad-free, gstreamer1-plugins-base, gstreamer1-plugins-good, harfbuzz, httpd, ipa, kernel, libjpeg-turbo, libnbd, LibRaw, libsndfile, libssh, libtiff, libvirt, libX11, libXpm, mingw components, mingw-glib2, mingw-pixman, mod_http2, mod_jk and mod_proxy_cluster, motif, mutt, openssl and openssl-fips-provider, osbuild and osbuild-composer, pam, pcp, pcs, perl, pmix, podman, python-jinja2, python3.11, python3.11-cryptography, python3.11-urllib3, qemu-kvm, qt5-qtbase, runc, skopeo, squashfs-tools, systemd, tcpdump, tigervnc, toolbox, traceroute, webkit2gtk3, wpa_supplicant, xorg-x11-server, xorg-x11-server-Xwayland, and zziplib), SUSE (docker, ffmpeg, ffmpeg-4, frr, and kernel), and Ubuntu (anope, freerdp3, and php7.0, php7.2, php7.4, php8.1).
An anonymous reader quotes a report from TechCrunch: Ahead of its annual GitHub Universe conference in San Francisco early this fall, GitHub announced Copilot Workspace, a dev environment that taps what GitHub describes as "Copilot-powered agents" to help developers brainstorm, plan, build, test and run code in natural language. Jonathan Carter, head of GitHub Next, GitHub's software R&D team, pitches Workspace as somewhat of an evolution of GitHub's AI-powered coding assistant Copilot into a more general tool, building on recently introduced capabilities like Copilot Chat, which lets developers ask questions about code in natural language. "Through research, we found that, for many tasks, the biggest point of friction for developers was in getting started, and in particular knowing how to approach a [coding] problem, knowing which files to edit and knowing how to consider multiple solutions and their trade-offs," Carter said. "So we wanted to build an AI assistant that could meet developers at the inception of an idea or task, reduce the activation energy needed to begin and then collaborate with them on making the necessary edits across the entire codebase."
Given a GitHub repo or a specific bug within a repo, Workspace -- underpinned by OpenAI's GPT-4 Turbo model -- can build a plan to (attempt to) squash the bug or implement a new feature, drawing on an understanding of the repo's comments, issue replies and larger codebase. Developers get suggested code for the bug fix or new feature, along with a list of the things they need to validate and test that code, plus controls to edit, save, refactor or undo it. The suggested code can be run directly in Workspace and shared among team members via an external link. Those team members, once in Workspace, can refine and tinker with the code as they see fit.
Perhaps the most obvious way to launch Workspace is from the new "Open in Workspace" button to the left of issues and pull requests in GitHub repos. Clicking on it opens a field to describe the software engineering task to be completed in natural language, like, "Add documentation for the changes in this pull request," which, once submitted, gets added to a list of "sessions" within the new dedicated Workspace view. Workspace executes requests systematically step by step, creating a specification, generating a plan and then implementing that plan. Developers can dive into any of these steps to get a granular view of the suggested code and changes and delete, re-run or re-order the steps as necessary. "Since developers spend a lot of their time working on [coding issues], we believe we can help empower developers every day through a 'thought partnership' with AI," Carter said. "You can think of Copilot Workspace as a companion experience and dev environment that complements existing tools and workflows and enables simplifying a class of developer tasks ... We believe there's a lot of value that can be delivered in an AI-native developer environment that isn't constrained by existing workflows."
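The session workflow described above — a specification step, a plan step and an implementation step, any of which can be deleted or re-run — can be modeled with a small sketch. This is purely a hypothetical illustration of the workflow, not GitHub's actual API; the `Session` and `Step` names are invented for the example, and the model call is stubbed out:

```python
from dataclasses import dataclass, field


@dataclass
class Step:
    """One stage of a Workspace-style session."""
    name: str        # "specification", "plan", or "implementation"
    output: str = ""


@dataclass
class Session:
    """A natural-language task broken into sequential, revisable steps."""
    task: str
    steps: list = field(default_factory=list)

    def run_step(self, name: str) -> Step:
        # Stand-in for a model call; here we just echo the task.
        step = Step(name, f"{name} for: {self.task}")
        self.steps.append(step)
        return step

    def rerun(self, index: int) -> Step:
        # Re-running an earlier step discards everything after it,
        # since later steps were derived from its output.
        name = self.steps[index].name
        del self.steps[index:]
        return self.run_step(name)


session = Session("Add documentation for the changes in this pull request")
for stage in ("specification", "plan", "implementation"):
    session.run_step(stage)
print([s.name for s in session.steps])
```

The key design point the sketch captures is that the steps form a dependency chain: editing or re-running the plan invalidates the implementation derived from it.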
Read more of this story at Slashdot.
Jules Roscoe reports via 404 Media: Russia has replaced Wikipedia with a state-sponsored encyclopedia that is a clone of the original Russian Wikipedia but which conveniently has been edited to omit things that could cast the Russian government in poor light. Russian Wikipedia editors used to refer to the Russian-language Wikipedia as Ruwiki; the new one is called Ruviki, has "ruwiki" in its URL, and has copied all Russian-language Wikipedia articles and strictly edited them to comply with Russian laws. The new articles exclude mentions of "foreign agents," the Russian government's designation for any person or entity which expresses opinions about the government and is supported, financially or otherwise, by an outside nation. [...]
Wikimedia RU, the Russian-language chapter of the non-profit that runs Wikipedia, was forced to shut down in late 2023 amid political pressure due to the Ukraine war. Vladimir Medeyko, the former head of the chapter who now runs Ruviki, told Novaya Gazeta Europe in July that he believed Wikipedia had problems with "reliability and neutrality." Medeyko first announced the project to copy and censor the 1.9 million Russian-language Wikipedia articles in June. The goal, he said at the time, was to edit them so that the information would be "trustworthy" as a source for all Russian users. Independent outlet Bumaga reported in August that around 110 articles about the war in Ukraine were missing in full, while others were severely edited. Ruviki also excludes articles about reports of torture in prisons and scandals of Russian government representatives. [...]
Graphic designer Constantine Konovalov calculated the number of characters changed between Wikipedia RU and Ruviki articles on the same topics, and found that there were 205,000 changes in articles about freedom of speech; 158,000 changes in articles about human rights; 96,000 changes in articles about political prisoners; and 71,000 changes in articles about censorship in Russia. He wrote in a post on X that the censorship was "straight out of a 1984 novel." Interestingly, the Ruviki article about George Orwell's 1984 entirely omits the Ministry of Truth, which is the novel's main propaganda outlet concerned with governing "truth" in the country.
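The kind of comparison Konovalov describes — counting how many characters were changed between the Wikipedia and Ruviki versions of an article — can be approximated with Python's standard `difflib`. This is a minimal sketch of one plausible way to do such a count, not Konovalov's actual methodology (which 404 Media does not detail); the sample strings are invented placeholders:

```python
import difflib


def changed_chars(original: str, edited: str) -> int:
    """Count characters inserted or deleted between two article versions."""
    matcher = difflib.SequenceMatcher(None, original, edited)
    changed = 0
    for tag, i1, i2, j1, j2 in matcher.get_opcodes():
        if tag != "equal":
            # Count removed characters from the original plus
            # added characters in the edited version.
            changed += (i2 - i1) + (j2 - j1)
    return changed


# Hypothetical before/after sentences, for illustration only.
before = "Independent reporting on the topic has been restricted."
after = "Reporting on the topic follows official guidelines."
print(changed_chars(before, after))
```

Applied article-by-article across matching topics, a count like this yields the per-topic change totals quoted above.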
ReneR writes: A major T2 Linux milestone has been released, shipping with full support for 25 CPU architectures and several C libraries, as well as restored support for Intel IA-64 Itanium. Additionally, many vintage X.org DDX drivers were fixed and tested to work again, and complete support for the latest KDE 6 and GNOME 46 was added.
T2 is known for its sophisticated cross-compile support and support for nearly all existing CPU architectures: Alpha, Arc, ARM(64), Avr32, HPPA(64), IA64, M68k, MIPS(64), Nios2, PowerPC(64)(le), RISCV(64), s390x, SPARC(64), SuperH, and x86(64). T2 is an increasingly popular choice for embedded systems and virtualization. It also still supports the Sony PS3, SGI, Sun and HP workstations, as well as the latest ARM64 and RISCV64 architectures.
The release contains a total of 5,140 changesets, including approximately 5,314 package updates, 564 issues fixed, 317 packages or features added and 163 removed, and around 53 improvements. As usual, most packages are up-to-date, including Linux 6.8, GCC 13, LLVM/Clang 18, as well as the latest versions of X.org, Mesa, Firefox, Rust, KDE 6 and GNOME 46!
More information, sources and binary distributions are open source and freely available at T2 SDE.
Ahead of its annual I/O developer conference in May, Google has decided to lay off staff across key teams like Flutter, Dart, Python and others. "As we've said, we're responsibly investing in our company's biggest priorities and the significant opportunities ahead," said a Google spokesperson. "To best position us for these opportunities, throughout the second half of 2023 and into 2024, a number of our teams made changes to become more efficient and work better, remove layers, and align their resources to their biggest product priorities. Through this, we're simplifying our structures to give employees more opportunity to work on our most innovative and important advances and our biggest company priorities, while reducing bureaucracy and layers." TechCrunch reports: The company clarified that the layoffs were not company-wide but were reorgs that are part of the normal course of business. Affected employees will be able to apply for other open roles at Google, we're told. [...] Though Google didn't detail headcount, some of the layoffs at Google may have been confirmed in a WARN notice filed on April 24. WARN, or the California Worker Adjustment and Retraining Notification Act, requires employers with more than 100 employees to provide 60-day notice in advance of layoffs. In the filing, Google said it was laying off a total of 50 employees across three locations in Sunnyvale.
On social media, commenters raised concerns with the Python layoffs in particular, given the role that Python tooling plays in AI. But others pointed out that Google didn't eliminate its Python team; it replaced that team with another group based in Munich -- at least according to Python Steering Council member Thomas Wouters in a post on Mastodon last Thursday.
If 2023 was the tech industry's year of the A.I. chatbot, 2024 is turning out to be the year of A.I. plumbing. From a report: It may not sound as exciting, but tens of billions of dollars are quickly being spent on behind-the-scenes technology for the industry's A.I. boom. Companies from Amazon to Meta are revamping their data centers to support artificial intelligence. They are investing in huge new facilities, while even places like Saudi Arabia are racing to build supercomputers to handle A.I. Nearly everyone with a foot in tech or giant piles of money, it seems, is jumping into a spending frenzy that some believe could last for years.
Microsoft, Meta, and Google's parent company, Alphabet, disclosed this week that they had spent more than $32 billion combined on data centers and other capital expenses in just the first three months of the year. The companies all said in calls with investors that they had no plans to slow down their A.I. spending. In the clearest sign of how A.I. has become a story about building a massive technology infrastructure, Meta said on Wednesday that it needed to spend billions more on the chips and data centers for A.I. than it had previously signaled. "I think it makes sense to go for it, and we're going to," Mark Zuckerberg, Meta's chief executive, said in a call with investors.
The eye-popping spending reflects an old parable in Silicon Valley: The people who made the biggest fortunes in California's gold rush weren't the miners -- they were the people selling the shovels. No doubt Nvidia, whose chip sales have more than tripled over the last year, is the most obvious A.I. winner. The money being thrown at technology to support artificial intelligence is also a reminder of spending patterns of the dot-com boom of the 1990s. For all of the excitement around web browsers and newfangled e-commerce websites, the companies making the real money were software giants like Microsoft and Oracle, the chipmaker Intel, and Cisco Systems, which made the gear that connected those new computer networks together. But cloud computing has added a new wrinkle: Since most start-ups and even big companies from other industries contract with cloud computing providers to host their networks, the tech industry's biggest companies are spending big now in hopes of luring customers.