Android Pie Is Filled with AI

Artificial Intelligence plays a big role in Android 9, the latest version of Google’s mobile operating system, released Monday.

Called “Android Pie,” the OS is designed to learn from its users’ behavior, and apply those lessons to simplify and customize their phone experiences.

“From predicting your next task so you can jump right into the action you want to take, to prioritizing battery power for the apps you use most, to helping you disconnect from your phone at the end of the day, Android 9 adapts to your life and the ways you like to use your phone,” noted Sameer Samat, Google’s vice president of product management for Android and Google Play.


Adaptive Brightness and Adaptive Battery are two ways Android Pie uses AI to customize and improve a phone’s performance.

Adaptive Brightness learns what brightness levels a user likes in certain conditions and automatically adjusts the display to those settings when those conditions arise.

Adaptive Battery plugs into Google’s DeepMind systems and can learn a person’s phone usage patterns and make adjustments to optimize power usage.

“Users of the Android P beta program on Google Pixel phones found a 20 percent increase in battery life,” said David McQueen, research director for consumer devices in the London offices of ABI Research, a technology advisory firm.

“Battery life has always been a major pain point for the smartphone user, so this implementation of AI will be welcome relief,” he told TechNewsWorld.

Seeing Will Be Believing

The power management feature works without adding additional hardware, McQueen pointed out.

Huawei introduced performance-enhancing AI in its Mate 10 Pro product, he said, but to do it, the company had to add a chip to the device, which it called a “neural processing unit.”

“There’s not much going on in terms of new battery technology that can lengthen battery life, so Adaptive Battery could be a good thing,” suggested William Stofega, program director for mobile phones and drones at IDC, a market analysis company based in Framingham, Massachusetts.

The Adaptive Battery feature appears to be compelling, acknowledged Tuong Nguyen, a senior principal analyst at Gartner, a research and advisory company based in Stamford, Connecticut. However, he is withholding judgment on the feature until the verdict from users comes in.

“We see a lot of power optimization announcements, and I’m sure they work well enough,” Nguyen told TechNewsWorld, “but my perception as a consumer is that I can never stay sufficiently charged and am always using too much battery.”

Screen Slices

Another new addition to Android is App Actions, which makes connections between when and how you use apps and offers suggestions based on those connections. For example, it’s 5:15 p.m. on a Monday: App Actions may ask if you want to open the e-book you’ve been reading on your commute to and from work for the past week.

Google also announced a feature for Android Pie called “Slices,” which won’t appear in the OS until later this fall.

Slices shows relevant information from apps depending on a user’s screen activity. So if a user started typing Lyft into Google Search, Slices would display a slice of the Lyft app with information such as prices to a destination and the ETA for a driver.

“Slices is great because it brings us a step closer to the post-app world,” Nguyen said.

“Instead of searching through a dozen apps and individually opening them,” he continued, “the UI allows me to use them with fewer steps.”

Better Security

Android Pie also sports a new single home button for simpler navigation.

In addition, Android’s Overview feature has been redesigned to display full screen previews of recently used apps. It also now supports Smart Text Selection, providing action suggestions based on selected text.

Security has been beefed up in Android 9. It has an improved security model for biometrics. It uses a secure, dedicated chip to enable hardware security capabilities that protect sensitive data, such as credit card information.

Android 9 chooses the TLS protocol by default, as well as DNS over TLS, to help protect all Web communications and keep them private.

Multi-Camera and HEIF Support

Android’s photographic capabilities are expanded in Pie. It supports multiple cameras, which enables developers to access streams from a number of physical cameras simultaneously.

“Multi-camera support is a potentially cool feature because it impacts the trajectory of immersive augmented reality, mixed reality and virtual reality experiences,” Nguyen said.

“Anything that advances immersive is exciting for me, but it’s a long road, so don’t expect to see something with a super impact immediately,” he added. “It’s more of a building block for bigger things to come.”

Android Pie also supports a new image format, HEIF. The format provides better compression than the widely used JPEG format without a loss in quality. Apple has been using the format for a while.

A common complaint among consumers is a lack of storage on phones, Nguyen noted.

“I’m not familiar with the technical details on HEIF, but I think all consumers can appreciate having more room because of better compression,” he said.

Fighting Phone Addiction

With concerns rising about how much time people spend with their phones, Google decided to add some time management features to Android Pie.

“While much of the time we spend on our phones is useful, many of us wish we could disconnect more easily and free up time for other things,” observed Google’s Samat.

“In fact, over 70 percent of people we talked to in our research said they want more help with this,” he added. “So we’ve been working to add key capabilities right into Android to help people achieve the balance with technology they’re looking for.”

The new “digital well-being” features that will be added to Android Pie this fall include the following:

  • A Dashboard that helps users understand how they’re spending time on their devices;
  • An App Timer that lets users set time limits on apps and grays out their icons on the home screen when the time is up;
  • A Do Not Disturb mode, which silences all the visual interruptions that pop up on a screen; and
  • Wind Down, which switches on Night Light and Do Not Disturb and fades the screen to grayscale before bedtime.

While the new digital health features may be embraced by some users, they could be annoying to others.

“I can see things like Wind Down and app timers getting in the way,” IDC’s Stofega told TechNewsWorld. “I think people want to use their devices whenever and however they want.”

Possible Pain Points

For many Android users, all the goodies in the latest version of the OS are likely to remain out of reach for some time, since Pie works only on Pixel models and a few other phones that participated in the software’s beta program.

“It will be telling how quickly Android P is able to migrate to Samsung and Huawei smartphones, and then on to those that run Android One,” McQueen said.

Even for those who are able to get their hands on the new OS, there could be challenges.

“The issue always is how quickly will people be able to recognize some of these new features,” and whether these devices are “getting too complex for their own good,” Stofega said.

“These devices are becoming Swiss Army knife-like,” he remarked. “Device makers have to figure out and adjust to what people really need versus what’s technically possible.”

John P. Mello Jr. has been an ECT News Network reporter since 2003. His areas of focus include cybersecurity, IT issues, privacy, e-commerce, social media, artificial intelligence, big data and consumer electronics. He has written and edited for numerous publications, including the Boston Business Journal, the Boston Phoenix, Megapixel.Net and Government Security News.

Source

Top 5 Websites to Master Hacking With Kali Linux : For Beginners

Despite writing so many tutorials on hacking with Kali Linux, I often get stuck and have to consult other resources. My blog has a lot of things, but since I don’t know everything myself, there’s no way I can provide all the resources that you guys need. Here’s a list of the top 5 websites about hacking with Kali Linux. They have enough resources to answer 99.9% of your queries, and enough tutorials to teach you most of the stuff you need to know to get to a level where you can read the more complex stuff on any website.

5. Null Byte

Not dedicated to Kali, but to white hat hacking in general.

Advantages-

  • Lot of high quality content
  • Many content-creator / authors
  • Has a forum as well to ask questions
  • Active comments section

Disadvantages-

  • Not tailored to Kali
  • Navigation not very intuitive. Hard to find stuff related to Kali.

4. Hacking-Tutorial

Very old website. Most websites in this list are fairly new, and probably didn’t write tutorials for Backtrack (Kali is what came after Backtrack 5 r3, instead of a Backtrack 6). However, this website was around before the time of Backtrack 5 R1 (when the display manager was not started by default, and we had to use the startx command just to get a GUI).

Advantages-

  • Lot of high quality content
  • Most articles pertain to Kali Linux or Backtrack (the old posts). The Backtrack ones will work with Kali without any changes in commands/procedure.
  • Only one author, so all posts are from one guy, which means fewer low-quality posts.

Disadvantages-

  • Design looks cluttered. Too many sharing widgets.
  • Only one author on this website; while that has its advantages, there are disadvantages too. There’s only so much good stuff one person can write.

3. Hacking Tutorials

Fairly new, pretty looking and frequently updated website.

Advantages-

  • The homepage is quite good and has posts organized by categories.
  • Content is high quality (I said this for every damn website I posted here, but then they made it to the list because of the high quality content).
  • Has many tutorials that you may not find on other websites (most sites have redundant tutorials, all the sites I’ve mentioned here will have a post about aircrack-ng, etc., but this site has some uncommon attacks and tools covered as well).

Disadvantages-

  • Content is not well organized. You might end up reading something whose prerequisites you don’t know, so you can’t understand what’s going on in the tutorial.
  • Some very long posts doing stuff no one gives a rat’s ass about. Basically, lots of unnecessary content. I guess that comes as a side effect of having uncommon tutorials that won’t be found on other websites. There’s a reason why they aren’t on most websites: no one is interested in reading them.

2. Black More Ops

Not limited to Kali Linux, has a lot of hacking tutorials.

Advantages-

  • While it’s not limited to Kali, most of the content is focused on Kali
  • Again, lots of high quality content.
  • Navigation much easier

Disadvantages-

  • Some of the posts have nothing but a YouTube video, with no content to go along with it.
  • Some posts are news articles, instead of tutorials. Many people may not mind this, but some may.

1. Security Tube

This one has been around for ages too. However, it isn’t providing any new free content anymore (but there’s a lot of old content which is golden). The owner is now selling certified courses. (The creator is a badass: he has written books, sells courses, discovered vulnerabilities and written attacks for WEP, etc.)

Advantages-

  • Gold mine of resources if you’re really interested in getting into hacking. No script kiddie stuff. The owner makes video tutorials and groups them up into megaprimers. Everything I know about wireless hacking I learnt from his wireless hacking megaprimer (the videos are free, the certification, if you want, will cost you).
  • I haven’t stressed this enough, no script kiddie stuff. He actually has two assembly language megaprimers (again, free of cost).

Disadvantages-

  • Megaprimers were recorded on very old operating systems, mostly Backtrack 5.
  • Let’s be honest, not everyone wants to watch 10 hours of video to learn the intricacies of hacking. Some are happy using tools and not caring about why or how they work. Unless you want to pursue a career in security, knowing how to use the tools is often enough.

Despite the disadvantages, it’s an awesome website. If you are really a security enthusiast, this is the go-to site. All the other sites I’ve mentioned, as well as my blog, can’t give you the in-depth knowledge you can get from there.

At last, I’ll stop pretending that I’m a selfless person who wants every one of his blog’s visitors to go to competitor websites. The last item on the list is my own blog (yay!):

Bonus : Kali Tutorials

My hobby blog. I am the main content creator. Over time, many have contributed. In one accident, all content by two of my friends was lost, and I became the sole author again. Recently, another author has joined in, and there are two of us now, but I still have to do most of the posting.

Advantages-

  • Kali is our (mine really, but ours sounds so cool, almost like I have a team of authors) utmost priority. Only a few tutorials aren’t related to Kali.
  • Tried my best to make the navigation nice, and order the posts so that beginners can read them in order of difficulty.

Disadvantages-

  • Homepage sucks
  • I am the only person who replies to comments, and sometimes I can’t reply to all. So many queries go unanswered.
  • Site slow af.
  • Only two authors. The other author focuses on YouTube videos, so mainly it’s just me. So, not a lot of regular content.

PS: I’m looking for people who are willing to write content, mail me at admin@kalitutorials.net if you’re interested.
PPS: Would love to edit advantages/disadvantages based on comments (especially for my blog, since review of one’s own work is bound to be inaccurate and biased). Let me know how you feel. This article represents my personal views, and it’d be awesome to be able to incorporate a wider perspective on the basis of comments.

Source

AI combat arena ‘Gladiabots’ has enabled Linux support on Steam

No longer hidden behind a beta of Steam, the AI combat arena Gladiabots from GFX47 is now officially supported on Steam for Linux. Do note, the game is still in Early Access.

If you love strategy games and feel like you want a little more control over unit AI, this might be the game for you. In Gladiabots you assemble a team of robots, design their AI with a handy drag-and-drop interface and attempt to beat another AI in battle. There are decent tutorials, a campaign, cross-platform online play that doesn’t require you to be online at the same time, and it’s really quite clever.

While the programming side of it is quite simple to look at, the features it offers can end up quite complex. It’s a system that hides complication behind an easy-to-understand interface so as not to scare you away. It also offers a sandbox mode, so you can toy with the AI as much as you want, which is very cool. Once you’ve got the hang of it, there are even regular tournaments to get into too.

There’s certainly an interesting learning curve to it, though; the challenges can be a little difficult, requiring some creative thinking. I’ve spent quite a lot of time testing, tinkering and attempting to beat it, and it becomes a hard game to put down. Getting your AI to do exactly what you want, with a number of conditions, feels insanely satisfying.

For a look behind the scenes at what the developer is planning to add and fix, they have a public Trello tracker to follow.

You can grab it from itch.io and Steam.

Source

Plane Theme and Icons Give Your Desktop an Appearance Boost

Another theme pack with icons for your Linux desktop. The Plane theme is designed to make the desktop more elegant and simple, and it goes very well with its own icon pack. Nowadays many themes are under development for GNOME, and Plane is one of them; it has been updated constantly since 2017, with fixes to make the theme look better. It takes some parts from the Arc and Adwaita themes, and some other themes inspired the author to make Plane more eye-catching.

This theme comes in two versions: a light version and a dark version that is easy on the eyes. The pack also includes GNOME Shell themes, which let you match your GNOME Shell with your GTK theme.

Primarily, this pack targets the GNOME Shell desktop, but it can be used on other desktops as well, such as Cinnamon, Xfce, MATE, etc. The icons are designed for use with this theme pack, but you can use them with any theme of your choice. The themes are available for Ubuntu 18.10/18.04 and Linux Mint 19 via our PPA. The icons are available for Ubuntu 18.10/18.04/16.04/14.04 and Linux Mint 19/18/17. If you find any kind of bug or problem with this theme pack, report it to the author and it will be fixed in the next update.

Available for Ubuntu 18.10/18.04 Bionic/Linux Mint 19/and other Ubuntu derivatives
To install the Plane themes in Ubuntu/Linux Mint, open a terminal (press Ctrl+Alt+T) and run the following commands:
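(The actual commands did not survive the page extraction. What follows is a plausible sketch, assuming NoobsLab's usual themes PPA and a hypothetical package name; check the original post for the exact commands.)

sudo add-apt-repository ppa:noobslab/themes   # assumed PPA path
sudo apt-get update
sudo apt-get install plane-theme              # hypothetical package name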
Available for Ubuntu 18.10/18.04 Bionic/16.04 Xenial/14.04 Trusty/Linux Mint 19/18/17/and other Ubuntu derivatives
To install the Plane icons in Ubuntu/Linux Mint, open a terminal (press Ctrl+Alt+T) and run the following commands:
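(Again, the original commands were lost in extraction; a plausible sketch, assuming a NoobsLab icon PPA and a hypothetical package name.)

sudo add-apt-repository ppa:noobslab/icons    # assumed PPA path
sudo apt-get update
sudo apt-get install plane-icons              # hypothetical package name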
That’s it

Source

Innovating Nanotechnology with Open Science and AI

Nanotechnology is evidently a very popular buzzword, encompassing several remarkable applications in science and technology. In this new article on Open Science and Artificial Intelligence, we will explore how both of them impact Nanotechnology research.

Trivia: The term “Open Source” was coined by futurist Christine Peterson, an American nanotechnologist and co-founder of the Foresight Institute, which is primarily focused on Nanotechnology research.

What is Nanotechnology, again?

Before understanding what Nanotechnology is, let’s first look into the term “nano” (this might remind you of the text editor that is a favourite of many of us Linuxers in the FOSS community!). In both cases, “nano” simply refers to a scale of measurement. For example, if we want to measure a distance in nanometers (nm), a comparable value in meters would be:

1 meter = 1,000,000,000 nanometers;

that is, if you take one billionth of a meter, what you get is 1 nm. So, it is an extremely small scale of measurement.

Nanotechnology is the implementation of different techniques from science, engineering and technology in order to study phenomena at the nanoscale. In general, such studies are carried out in the range of 1-100 nm.

Why is Nanotechnology so significant?

Nanotechnology is of immense significance due to its wide variety of applications in diverse fields such as biology, physics, chemistry, materials science and many others.

It will be much easier to understand its significance if we talk about some noteworthy applications of Nanotechnology:

1. Healthcare

As we already know, Nanotechnology works at the nanoscale, which means we can work at the molecular and subatomic level. Nanomedicine, for instance, has created a revolution in the field of drug delivery, because nanotechnology enables the therapeutic molecule (contained in the medicine) to lock on to the desired protein, right on target, after consumption. This is carried out with the help of Nanoparticles.

It is thanks to Nanotechnology that chemotherapy can now be focused only on the disease-affected area, so the whole body need not go through the process of cancer treatment. Thus, the immune system is saved from being destroyed, as chemotherapy involves the use of toxic chemicals to get rid of cancerous regions.

2. Nanorobotics

You might have already heard about Nanobots. As the term “nano” (discussed just earlier) suggests, Nanobots are intelligent machines that are built at the nanoscale.

Nanobots are extremely helpful in medicine and industry. They can be preprogrammed to carry out a specific task. For example, Nanobots can be used to tackle oil pollution in a very effective manner, thus helping in cleaning up the environment.

AI (Artificial Intelligence) today greatly empowers Nanorobotics. We will talk about it briefly in a later section, where we will see how effectively AI and Nanotechnology converge with each other.

3. Biomaterials

Biomaterials are substances that are used medically to treat or diagnose a disease. They can include living tissue or artificially created material for use in biological systems to repair, replace or stimulate a damaged biological function. Biomimetic materials, as the word hints, mimic the behaviour of a specific biological system.

Since Nanotechnology allows nanoscale precision, it is a boon for developing Biomaterials. For example, in bone tissue engineering, ceramics, polymers, metals and composites can be developed at the nanoscale, allowing extreme accuracy. Such nanophase biomaterials are of great significance in orthopaedic implants.

Why do we need Open Source Nanotechnology?

An article named “Make nanotechnology research open-source” in the journal Nature encourages the adoption of an open source approach and explains in a very simple manner how innovation in Nanotechnology can be greatly hindered by patent abuse.

Excessive patenting in Nanotechnology:

  • increases costs,
  • slows down technical development, and
  • removes fundamental knowledge from the public domain.

There is also a separate section in the article titled “Open Source Alternatives” which clearly highlights how an Open Source Model would allow Nanotechnology companies to freely use the best tools, materials and devices available for carrying out research and using the technology without worrying about IP monopolies.

License fees would be eliminated, thus reducing costs. Such savings could be used for innovation instead, which is vital for a company’s survival. This openness also creates scope for small startups to enter the market and innovate in Nanotechnology research.

The field of nanotechnology is a combination of information (such as chemical formulae), software (for example, modelling tools) and hardware (such as atomic force microscopes). All three areas can adopt open-source principles, and some steps have already been taken towards this.

Pearce, J. M. (2012). Make nanotechnology research open-source. Nature, 491(7425), 519-521. doi:10.1038/491519a

An Open Science Approach towards Nanotechnology

Now that we have seen how open source revolutionizes Nanotechnology research, let us discuss in detail three important initiatives in Open Source Nanotechnology. Recall from our first Science article that Open Science implies Open Source, Open Access, Open Data and Open Standards.

1. nanoHUB: A Massive Open Source Initiative in Nanotechnology

Also mentioned in the article we just discussed, nanoHUB is an initiative begun in 2002 by the US National Science Foundation, which established a university network called the Network for Computational Nanotechnology (NCN) to support the National Nanotechnology Initiative.

nanoHUB.org has enabled researchers, educators, and professionals to collaborate and share resources in order to solve nanotechnology problems.

NCN has three noble goals:

  • bringing computational tools online,
  • making the tools easy to use, and
  • educating users about the tools and nanoscience.

Read more about it in the paper here, which was written from an educational perspective. You can also read a more recent paper here, which contains some useful references on nanoHUB.

2. caNanoLab: To speed up the use of nanotechnology in biomedicine

Another Open Source Initiative, caNanoLab enables data sharing to make the use of applied nanotechnology more convenient. The portal enables information sharing across research communities to accelerate and validate the use of nanotechnology in biomedicine.

Just like Bioinformatics (as we discussed in a previous Open Science article), Nanoinformatics has also emerged as a field of study. It primarily deals with any data related to Nanotechnology. After this data is collected and validated, it is stored and analyzed through various methods for useful applications. caNanoLab makes Nanoinformatics studies much easier.

3. NBI: Nano-Biomaterials Interactions Knowledgebase

The NBI Knowledgebase was created for use by the industry, academia, the general public, and regulatory agencies as a platform for an unbiased understanding of how biological systems are affected by nanomaterial exposure.

On the portal, you will find two sections:

Nanomaterial Library

Here you can look up a library of Nanomaterials with notable parameters like material type, core, surface chemistry, shape, charge, size and dendrimer generation.

Analysis of Nanomaterials

This section has all the parameters as in the library but with two additional options named Heatmap and Plot intended for analytical display.

Now that we have covered three Open Science initiatives in Nanotechnology, let’s conclude this section by leaving this link, which contains some exhaustive resources for Nanotechnology research. The page belongs to the eNanoMapper database, which also contains links to other initiatives.

Open Source NanoAI: Open Convergence of AI and Nanotechnology

It’s very obvious that, one way or another, AI (Artificial Intelligence) and Nanotechnology had to converge one day. This has opened up a whole new era of amazing possibilities.

Today’s AI can solve many challenges faced by nanotechnologists, some of which are listed below:

  • Interpreting results obtained from Nanoscale experiments
  • Estimation of multiple parameters effectively
  • Automatic characterization of Nanomaterial properties and complex I/O responses
  • System optimization
  • Data and algorithm design for Nanocomputers

Read more about Artificial Intelligence and Nanotechnology here.

If we think of all of the above applications of AI in Nanotechnology from an Open Source perspective, we can clearly perceive the elevated benefits. Nanotechnologists can collaborate effectively by sharing open information, source code and datasets, which can speed up Nanotechnology research with applied AI.

So, Open Source NanoAI means working with FOSS that implements both Artificial Intelligence and Nanotechnology.

Did you know that AI could use nanotechnology to create human organs for the replacement and repair of damaged ones, allowing people to live longer? There’s more: AI can even be used to create artificial meat from nano-stem cells!

Stem cells, as we see, can very much be used for a higher purpose. With Nanotechnology, stem cells can be transformed into bone cells on command. The process can also be used to treat deadly conditions such as heart disease and Parkinson’s.

AI is a driving force in Nanorobotics, particularly in therapeutics involving the immune system. Nanobots can make use of unsupervised machine learning to identify damaged human cells. All of this is possible because AI programs can teach Nanobots to differentiate between good and bad cells, with the help of a vast library included within the system that holds knowledge about Nanoinformatics and the human body.

Read more about the application of AI in Nanorobotics here.

Ongoing Research in Nanotechnology

Let’s now look into some interesting research that has been happening in the field in recent times. We found many examples and picked two:

1. Formulation of nanoparticles to promote crop immunity

This can be applied in rice crop fields to make rice plants immune to fungi, which can prove to be great news for farmers and agriculturists.

doi: 10.1101/339283

2. Delivering medicine through nanocarriers via nose-to-brain

Nasal delivery of surface modified nanomedicines has been proposed for the treatment of several central nervous system conditions including:

  • Migraines
  • Sleep disorders
  • Viral infections
  • Brain tumors
  • Multiple sclerosis
  • Schizophrenia
  • Parkinson’s disease
  • Alzheimer’s disease
  • Obesity

A key benefit of this approach is that the side effects seen with traditional drugs are much less of a concern, since nanocarriers can completely bypass the blood-brain barrier through nasal delivery.

doi: 10.3390/pharmaceutics10010034

Summary

So, in this Open Science and AI article, we introduced how Nanotechnology works and why it is an important field of study. We shared three examples, namely Healthcare, Nanorobotics and Biomaterials. We then saw why Open Source Nanotechnology is necessary to carry out tasks and research in the field more effectively.

Further on, we saw how an Open Science Approach drives Nanotechnology at a rapid pace. We saw three Open Science Initiatives, namely, nanoHUB, caNanoLab and NBI.

We also highlighted how AI and Nanotechnology converge for a common purpose and finally, we noted some ongoing research work in the field of Nanotechnology.

Thank you for reading. Please share any feedback in the comments section below. Our next article is going to be about 3D printing and will also include some discussion of interesting nanoscale applications.

Source

Download KaOS 2018.10

KaOS is an open source Linux distribution built around the KDE Plasma Workspaces and Application project, as well as the pacman package manager software from the Arch Linux operating system.

Distributed as a 64-bit Live DVD

The system is distributed as a single Live DVD ISO image that supports only 64-bit hardware platforms. It can be written to a blank DVD disc with any CD/DVD burning software, or to a USB flash drive using the UNetbootin application.
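On Linux, the image can also be written directly with dd; a minimal sketch, assuming the downloaded file is named KaOS-2018.10-x86_64.iso and the flash drive is /dev/sdX (double-check the device with lsblk first, since dd will overwrite it without asking):

sudo dd if=KaOS-2018.10-x86_64.iso of=/dev/sdX bs=4M status=progress && sync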

KaOS is completely independent and provides users with a rolling-release system, which will make sure that their installations will always be up-to-date without requiring them to download a new ISO image and upgrade the entire OS.
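In practice, keeping a rolling KaOS installation current comes down to running a single pacman transaction regularly, for example:

sudo pacman -Syu   # refresh the package databases (-Sy) and upgrade all installed packages (-u)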

Boot options

The boot medium allows users to run the live environment with support for Nvidia and AMD/ATI Radeon graphics cards, run a memory test, detect the hardware components, or boot the currently installed operating system.

Based on Arch Linux and built around KDE

As mentioned, the live session is powered by the KDE project, which provides a modern computing experience with a neat collection of hand-picked open source applications for common tasks.

Its main goal is to be small and fully focused on KDE and Qt technologies. It is based on the Arch Linux operating system and uses the pacman application as its default package manager for installing, removing and updating software packages.

One of its key features is the graphical installer provided on the Live DVD, which not only allows novice users to install the operating system with only a few mouse clicks, but also provides advanced configuration options for experienced Linux users.

Default applications

Default applications include the QupZilla web browser, Calligra office suite, Quassel IRC client, Krita digital painting software, Clementine music player, Plasma Media Center, Kdenlive video editor, Dragon Player video player, and many others.

Even though it’s based on the Arch Linux operating system, KaOS has its own software repositories, comprising the Core, Main and Apps groups, which give users quick access to some of the best and most useful applications, libraries and core components.

Bottom line

If you like KDE and Arch Linux-based distributions, you should really give KaOS a try. Who knows, it might become your only operating system.


Source

New Custom Linux Distro is Systemd-Free, Debian-Based, and Optimized for Windows 10


Posted by EditorDavid on Saturday September 22, 2018 @11:34AM from the Windows-shopping-at-the-Microsoft-Store dept.

An anonymous reader quotes MSPowerUser:
Nearly every Linux distro is already available in the Microsoft Store, allowing developers to use Linux scripting and other tools running on the Windows Subsystem for Linux (WSL). Now another distro has popped up in the Store, and unlike the others it claims to be specifically optimised for WSL, meaning a smaller and more appropriate package with sane defaults which helps developers get up and running faster.

WLinux is based on Debian, and the developer, Whitewater Foundry, claims their custom distro will also allow faster patching of security and compatibility issues that appear from time to time between upstream distros and WSL… Popular development tools, including git and python3, are pre-installed. Additional packages can be easily installed via the apt package management system… A handful of unnecessary packages, such as systemd, have been removed to improve stability and security.

 

The distro also offers out-of-the-box support for GUI apps with your choice of X client, according to the original submission.

WLinux is open source under the MIT license, and is available for free on GitHub. It can also be downloaded from the Microsoft Store at a 50% discount, with the development company promising the revenue will be invested back into new features.

 


Source

Git It Right

The Git version control system is a powerful tool for managing large and small software development projects. We’ll show you how to get started.

With its egalitarian spirit and tradition of strong community involvement, open source development doesn’t scale very well without some form of version control.

Over the past several years, Git [1] has settled in as the most viable and visible version control tool for the Linux space. Git was created by Linus Torvalds himself, and it got its start as the version control system for the Linux kernel development community (see the box entitled “The Birth of Git”). Since then, Git has been adopted by hundreds of open source projects and is the featured tool on several large code-hosting sites, such as GitHub.

Even if you aren’t a professional developer, if you work in the Linux space, you occasionally need to download and compile source code, and, more often than not, that means interacting with Git. Many Linux users pick up occasional Git commands on the fly without ever getting a formal introduction to what Git is and how it works. This article is the first in a two-part series aimed at building a better understanding of Git for everyday Linux users. This first article shows how to install Git, create a Git project, commit changes, and clone the repository to a remote location. Next month, you’ll learn some advanced techniques for managing code in Git.
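As a taste of the ground the article covers, the basic cycle looks roughly like this (a sketch, not the article's own listing; the project and file names are placeholders):

sudo apt install git                          # install Git on a Debian/Ubuntu system
git config --global user.name "Your Name"     # tell Git who you are (needed once before committing)
git config --global user.email "you@example.com"
git init myproject                            # create a new, empty repository
cd myproject
echo "hello" > README                         # something to commit
git add README                                # stage the change
git commit -m "Initial commit"                # record it in the project history
cd ..
git clone myproject myproject-clone           # clone the repository to another location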

[…]

Use the Express-Checkout link below to read the full article (PDF).

Source

The Professional Approach to Upgrading Linux Servers

With the release of Ubuntu 18.04, I thought it would be the perfect time to talk about server upgrades. Specifically, I’m going to share with you the process that I’m using to perform upgrades.

I don’t shy away from work, but I hate doing work that really isn’t needed. That’s why my first question when it comes to upgrades is:

Is this upgrade even necessary?

The first thing to know is the EOL (End of Life) for support for the OS you’re using. Here are the current EOLs for Ubuntu:

Ubuntu 14.04 LTS: April 2019
Ubuntu 16.04 LTS: April 2021
Ubuntu 18.04 LTS: April 2023

(By the way, Red Hat Enterprise Linux delivers at least 10 years of support for its major releases. This is just one example of why you’ll find RHEL being used in large organizations.)

So, if you are thinking of upgrading from Ubuntu 16.04 to 18.04, consider if the service that server provides is even needed beyond April 2021. If the server is going away in the next couple of years, then it probably isn’t worth your time to upgrade it.

If you do decide to go ahead with the upgrade, then…

Determine What Software Is Being Used

Hopefully, you have a script or used some sort of documented process to build the existing server. If so, then you have a good idea of what’s already on the server.

If you don’t, it’s time to start researching.

Look at the running processes with the “ps” command. I like using “ps -ef” because it shows every process (-e) with a full-format listing (-f).

ps -ef

Look at any non-default users in /etc/passwd. What processes are they running? You can show the processes of a given user by using the “-u” option to “ps.”

ps -fu www-data
ps -fu haproxy

Determine what ports are open and what processes have those ports open:

sudo netstat -nutlp
sudo lsof -nPi

Look for any cron jobs being used.

sudo ls -lR /etc/cron*
sudo ls -lR /var/spool/cron

Look for other miscellaneous clues such as disk usage and sudo configurations.

df -h
sudo du -h /home | sort -h
sudo cat /etc/sudoers
sudo ls -l /etc/sudoers.d

Determine the Current Software Versions

Now that you have a list of software that is running on your server, determine what versions are being used. Here’s an example list for an Ubuntu 16.04 system:

  • HAProxy 1.6.3
  • Nginx 1.10.3
  • MariaDB 10.0.34

One way to get the versions is to look at the packages like so:

dpkg -l haproxy nginx mariadb-server

Determine the New Software Versions

Now it’s time to see what version of each piece of software ships with the new distro version. For Ubuntu 18.04 you can use “apt show PKG_NAME” (note that package names are lowercase):

apt show haproxy

To display just the version, grep it out like so:

apt show haproxy | grep -i version

Here’s our list for Ubuntu 18.04:

  • HAProxy 1.8.8
  • Nginx 1.14.0
  • MariaDB 10.1.29

Read the Release Notes

Now, find the release notes for each version of each piece of software. In this example, we are upgrading HAProxy from 1.6.3 to 1.8.8. Most software these days conforms to Semantic Versioning guidelines. In short, given a version number MAJOR.MINOR.PATCH, increment the:

MAJOR version when you make incompatible API changes,
MINOR version when you add functionality in a backwards-compatible manner, and
PATCH version when you make backwards-compatible bug fixes.

This means we are most concerned with major versions, somewhat concerned with minor versions, and can pretty much ignore the patch version. So we can think of this upgrade as being from 1.6 to 1.8.

Because it’s the same major version (1), we should be fine to just perform the upgrade. That’s the theory, anyway. It doesn’t always work in practice.

In this case, read the release notes for HAProxy versions 1.7 and 1.8. Look for any signs of backward compatibility issues such as configuration syntax changes. Also look for new default values and then consider how those new default values could affect the environment.

Repeat this process for the other major pieces of software. In this example that would be going from Nginx 1.10 to 1.14 and MariaDB 10.0 to 10.1.

Make Changes Based on the Release Notes

Based on the information from the release notes, make any required or desired adjustments to the configuration files.

If you have your configuration files stored in version control, make your changes there. If you have configuration files or modifications performed by your build scripts, make your changes there. If you aren’t doing either one of those, DO IT FOR THIS DEPLOYMENT/UPGRADE. 😉 Seriously, just make a copy of the configuration files and make your changes to them. That way you can push them to your new server when it’s time to test.
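If you are starting from zero, even a bare-bones local repository beats loose copies; a minimal sketch, assuming Git is installed and using HAProxy's config as the example:

mkdir ~/server-configs && cd ~/server-configs
git init
cp /etc/haproxy/haproxy.cfg .
git add haproxy.cfg
git commit -m "haproxy.cfg as shipped on the old server"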

If you’re not sure what configuration file or files a given service uses, refer to its documentation. You can read the man page or visit the website for the software.

Also, you can list the contents of its package and look for “etc”, “conf”, and “cfg”. Here’s an example from an Ubuntu 16.04 system:

dpkg -L haproxy | grep -E 'etc|cfg|conf'

The “dpkg -L” command lists the files in the package while the grep command matches “etc”, “cfg”, or “conf”. The “-E” option is for extended regular expressions. The pipe (|) acts as an “or” in regular expressions.

You can also use the locate command.

locate haproxy | grep -E 'etc|cfg|conf'

In case you’re wondering, the main configuration file for haproxy is haproxy.cfg.

Install the Software on the Upgraded Server

Now install the major pieces of software on a new server running the new release of the distro.

Of course, use a brand new server installation. You want to test your changes before you put them into production.

By the way, if you have a dedicated non-production (test/dev) network, use it for this test. If you have a service on the server you are upgrading that connects to other servers/services, it’s a good idea to isolate it from production. You don’t want to accidentally perform a production action when you’re testing. This means you may need to replicate those other servers in your non-production environment before you can fully test the particular upgrade that you’re working on.

If you have deployment scripts you can use them to perform the installs. If you use Ansible or the like, use it against the new server. Or you can manually perform the install, making notes of all the commands you run so that you can put them in a script later on. For example, to manually install HAProxy on Ubuntu 18.04, run:

sudo apt install -y haproxy

Next, put the configuration files in place.

Start the Services

If the software that you are installing is a service, make sure it starts at boot time.

sudo systemctl enable haproxy

Start the service:

sudo systemctl start haproxy

If your existing deployment script starts the service automatically, perform a restart to make sure that any of the new configuration file changes are being used.

sudo systemctl restart haproxy

See if the service is running.

sudo systemctl status haproxy

If it failed, read the error message and make the required corrections. Perhaps there is a configuration option that worked with the previous version that isn’t valid with the new version, for example.
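For HAProxy specifically, two commands are worth knowing at this step (the config path assumes the default install):

sudo haproxy -c -f /etc/haproxy/haproxy.cfg   # validate the configuration without starting the service
sudo journalctl -u haproxy -n 50              # show the last 50 lines of the service's log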

Import the Data

If you have services that store data, such as a database service, then import test data into the system.

If you don’t have test data, then copy over your production data to the new server.

If you are using production data, you need to be very careful at this point.

1) You don’t want to accidentally destroy or alter any production data and…

2) You don’t want your new system taking any unwanted actions based on production data.

On point #1, you don’t want to make a costly mistake such as getting your source and destinations mixed up and end up overwriting (or deleting) production data. Pro tip: make sure you have good production backups that you can restore.

On point #2, you don’t want to do something like double-charge the business’s customers or send out duplicate emails, etc. To this end, stop all the software and services that are not required for the import before you do it. For example, disable cron jobs and stop any in-house software running on the test system that might kick off an action.

It’s a good idea to have TEST data. If you don’t have test data, perhaps you can use this upgrade as an opportunity to create some. Take a copy of the production data and anonymize it. Change real email addresses to fake ones, etc.
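For example, if the data lives in MariaDB, anonymizing an email column can be a one-liner; a sketch with entirely hypothetical database, table, and column names:

mysql testdb -e "UPDATE users SET email = CONCAT('user', id, '@example.com');"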

As previously mentioned, do your tests on a non-production network that cannot directly touch production.

Perform Service Checks

If you have a service availability monitoring tool (and why wouldn’t you???), then point it at the new server. Let it do its job and tell you if something isn’t working correctly. For example, you may have installed and started HAProxy, but perhaps it didn’t open the proper port because you forgot to copy over the configuration.

Whether or not you have a service availability monitoring tool, use what you know about the service to see if it’s working properly. For example, did it open up the proper port or ports? (Use the “netstat” and “lsof” commands from above). Are there any error messages you should be concerned about?

If you’re at all familiar with the service, test it. If it’s a web server, does it serve up the proper web pages? If it’s a database server, can you run queries against it?
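A couple of quick smoke tests along those lines (the hostname is a placeholder for your new test server):

curl -I http://new-server.example.com/           # a web server should answer with the expected status and headers
mysql -h new-server.example.com -e "SELECT 1;"   # a database server should return a single row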

If you’re not familiar with the service, or not a normal user of it, it’s time to enlist help. If you have a team that is responsible for testing, hand it over to them. Maybe it’s time for someone in the business who uses the service to check it out and see if it works as expected.

If you don’t have a regression testing process in place, now would be a good time to create one. The goal is to make changes and know that those changes haven’t broken the service. Upgrading the OS is a major change that has the potential to break things in a major way.

Prepare for Production

Once you’ve completed this entire process and tested your work, put all your notes into a production implementation plan. Use that plan as a checklist when you’re ready to go into production. It’s probably worth it to test your plan on another newly installed system to make sure everything goes smoothly. This is especially true when you are working on a really important system.

By the way, don’t think less of yourself for having a detailed plan and checklist. It actually shows your professionalism and commitment to doing good work.

For example, would you rather fly on a plane with a pilot who uses a checklist or one who just “wings it”? I don’t care how smart or talented that pilot is; I want them to double-check their work when it comes to my life.

Yes, It’s a Lot of Work

You might be thinking to yourself, “Wow, this is a very tedious and time-consuming process.” And you’d be right.

If you want to be a good/great Linux professional, this is exactly what it takes. Attention to detail and hard work are part of the job.

The good news is that you get compensated in proportion to your professionalism and level of responsibility.

If it was fast and easy, everyone would be doing it, right?

Hopefully, this post gave you some ideas beyond just blindly upgrading and hoping for the best. 😉

Speaking of the best…. I wish you the best!

Jason

P.S. If you’re ready to level up your Linux skills, check out the courses I created for you here.

Source

LMDE 3 “Cindy” Cinnamon – BETA Release

This is the BETA release for LMDE 3 “Cindy”.

LMDE 3 Cindy

LMDE is a Linux Mint project and it stands for “Linux Mint Debian Edition”. Its main goal is for the Linux Mint team to see how viable our distribution would be and how much work would be necessary if Ubuntu were ever to disappear. LMDE aims to be as similar as possible to Linux Mint, but without using Ubuntu. The package base is provided by Debian instead.

There are no point releases in LMDE. Other than bug fixes and security fixes, Debian base packages stay the same, but Mint and desktop components are updated continuously. When ready, newly developed features land directly in LMDE, whereas they are staged for inclusion in the next upcoming Linux Mint point release.

Important info:

The release notes provide important information about known issues, as well as explanations, workarounds and solutions.

To read the release notes, please visit:

Release Notes for LMDE 3

System requirements:

  • 1GB RAM (2GB recommended for a comfortable usage).
  • 15GB of disk space (20GB recommended).
  • 1024×768 resolution (on lower resolutions, press ALT to drag windows with the mouse if they don’t fit on the screen).

Notes:

  • The 64-bit ISO can boot with BIOS or UEFI.
  • The 32-bit ISO can only boot with BIOS.
  • The 64-bit ISO is recommended for all modern computers (Almost all computers sold since 2007 are equipped with 64-bit processors).

Upgrade instructions:

  • This BETA release might contain critical bugs, please only use it for testing purposes and to help the Linux Mint team fix issues prior to the stable release.
  • It will be possible to upgrade from this BETA to the stable release.

Bug reports:

  • Bugs in this release should be reported on Github at https://github.com/linuxmint/lmde-3-cinnamon-beta/issues.
  • Create one issue per bug.
  • As described in the Linux Mint Troubleshooting Guide, do not report or create issues for observations.
  • Be as accurate as possible and include any information that might help developers reproduce the issue or understand the cause of the issue:
    • Bugs we can reproduce, or whose cause we understand, are usually fixed very easily.
    • It is important to mention whether a bug happens “always”, or “sometimes”, and what triggers it.
    • If a bug happens but didn’t happen before, or doesn’t happen in another distribution, or doesn’t happen in a different environment, please mention it and try to pinpoint the differences at play.
    • If we can’t reproduce a particular bug and we don’t understand its cause, it’s unlikely we’ll be able to fix it.
  • Please visit https://github.com/linuxmint/Roadmap to follow the progress of the development team between the BETA and the stable release.

Download links:

Here are the download links for the 64-bit ISO:

A 32-bit ISO image is also available at https://www.linuxmint.com/download_all.php.

Integrity and authenticity checks:

Once you have downloaded an image, please verify its integrity and authenticity.

Anyone can produce fake ISO images; it is your responsibility to check that you are downloading the official ones.
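As a sketch of what that check typically looks like on Linux (the ISO filename is an assumption; sha256sum.txt and its .gpg signature come from the official download page):

sha256sum -b lmde-3-cindy-cinnamon-64bit-beta.iso   # compare the output against the official sha256sum.txt
gpg --verify sha256sum.txt.gpg sha256sum.txt        # verify the signature on the sums file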

Enjoy!

We look forward to receiving your feedback. Many thanks in advance for testing the BETA!

Source
