Oracle has open-sourced GraphPipe to streamline the deployment of machine learning models. The project’s goal is to improve deployment results for machine learning models, noted project leader Vish Abrams, in part by creating an open standard for model serving. However, the company has a strained relationship with open source developers, so its decision to open-source GraphPipe may not generate a flood of interest.
In spite of all the high-profile breaches that seem to sweep the headlines with greater frequency, companies slowly but surely have been getting a handle on internal security practices. At this point, it’s hard to imagine any employee, in or out of the tech sector, who hasn’t been run through antiphishing training. However, security is only as strong as its weakest link.
Google plans to open its Maps APIs to video game developers, which could result in far more realistic settings in augmented reality games. With access to real-time map updates and rich location data, developers will have many choices of settings for their games. The APIs will provide devs with what Google has described as a “living model of the world” to use as a foundation for game worlds.
The Raspberry Pi Foundation on Wednesday launched the Raspberry Pi 3 Model B+, two years after the introduction of its predecessor, the Raspberry Pi 3 Model B. The computer runs the open source Raspbian operating system. The new board is an incremental upgrade to a line that has become entrenched in the education, hobbyist and industrial markets.
Microsoft has announced the first major upgrade to its Quantum Development Kit since its introduction last year. It has added several new features designed to open the platform to a wider array of developers, including support for Linux and macOS, as well as additional open source libraries. Further, the kit is now interoperable with the Python programming language.
We often talk about hybrid cloud business models, but virtually always in the context of traditional processor-bound applications. What if deep learning developers and service operators could run their GPU-accelerated model training or inference delivery service anywhere they wanted? What if they could do so without having to worry about which Nvidia graphics processing unit they were using?