Bisected: The Unfortunate Reason Linux 4.20 Is Running Slower

After running a lot of tests and then bisecting the Linux 4.20 kernel merge window, the reason for the significant slowdowns in the Linux 4.20 kernel for many real-world workloads is now known…

This latest Linux 4.20 testing endeavor started out with seeing Intel Core i9 performance pull back in many synthetic and real-world tests. The damage ranged from Rodinia scientific OpenMP tests taking 30% longer, to Java-based DaCapo tests taking up to ~50% more time to complete, to code compilation tests taking measurably longer, to lower PostgreSQL database server performance and longer Blender3D rendering times. That happened on Core i9 7960X and Core i9 7980XE test systems, while AMD Threadripper 2990WX performance was unaffected by the Linux 4.20 upgrade.

In some cases this Linux 4.20 slowdown is significant enough that the Threadripper 2990WX is able to pick up extra wins over the Core i9 7980XE.

 

When digging through more of my test system data, a set of systems I have benchmarking the latest Linux kernel Git code every other day also saw a significant pullback in performance from the early days of the Linux 4.20 merge window up through the very latest kernel code as of today. Those affected systems weren’t high-end HEDT boxes but included a low-end Core i3 7100 as well as Xeon E5 v3 and Core i7 systems. AMD systems, though, still didn’t appear impacted. Those tests also found workloads like the Smallpt renderer slowing down significantly, PHP performance taking a major dive, and other scientific workloads like HMMer facing a major setback compared to the current Linux 4.19 stable series.

Bisecting the Linux 4.20 kernel slowdown… The sizable difference during that process.

With seeing clear performance regressions on a number of systems when running the latest Linux 4.20 code, and especially with being able to reproduce it on high-core-count hardware (thus significantly cutting down the kernel build times), this morning I kicked off the kernel bisecting process to see why this new kernel is causing many workloads to run so much slower than Linux 4.19. With the Phoronix Test Suite doing the heavy-lifting, the problematic commit was quickly uncovered.
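
For readers unfamiliar with the process, a kernel bisection boils down to repeatedly building and benchmarking the midpoint between a known-good and a known-bad commit. The sketch below is only the generic shape of such a run; the good/bad endpoints are placeholders, and in my case the Phoronix Test Suite automated the benchmark runs:

$ cd linux
$ git bisect start
$ git bisect good v4.19      # last known-good kernel
$ git bisect bad HEAD        # current 4.20 merge-window code
# Git checks out a midpoint commit; build it, boot into it, run the benchmark...
$ make -j$(nproc) && sudo make modules_install install
# ...then report the result and repeat until Git names the first bad commit:
$ git bisect good            # or: git bisect bad
$ git bisect reset           # clean up once done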

Going into this testing, my thinking was that it was perhaps an Intel P-State CPU frequency scaling driver regression, something that has caused performance problems in the past, or perhaps a scheduler change. There have also been so many Linux 4.20 changes in general that some unintentional regression primarily hurting Intel Linux performance could easily have slipped in somewhere… As a reminder, Linux 4.20 is the biggest kernel release of the year in terms of lines of code changed, with more than 354 thousand lines of new code added at the end of October when this merge window opened.

 

As outlined in the Linux 4.20 feature overview, there are a lot of exciting changes with this kernel. But why is it slower? More work on f!*#(# Spectre!

Source

Installing and Using AWS CLI on Ubuntu

AWS offers an enormous range of services, and launching even the simplest of them requires numerous steps. You will soon find that time spent on the AWS console (the Web UI) is time well wasted. While I don’t condone this design and wish for something simpler, I do realize that most of us are stuck with AWS because our organization chose it as its platform for one reason or another.

Instead of complaining about it, let’s try to limit our attention to the small set of services that an organization typically uses, such as ECS, AWS Lambda, S3 or EC2. One way of doing that is by using the AWS CLI. It offers an easy way to integrate the AWS interface with your everyday workflow. Once you get over the initial hurdle of setting up the CLI and getting used to a few commands, this will save you hours and hours of time. Time that you can spend on much more pleasant activities.

This tutorial assumes that you already have an AWS account. This can be an IAM user account with programmatic access issued by your organization. If you have your own personal account with AWS, then do not use your AWS root credentials for the CLI! Instead, create an IAM user with programmatic access for all CLI related work. When deciding on the policy that you will attach to this new user, think about what you want to do with the account.

The most permissive policy is AdministratorAccess, which is what I will be using. As you create an IAM user, it gets assigned a username, an Access Key ID and a Secret Access Key. Keep the latter two confidential.

For my local environment, I will be using Ubuntu 18.04 LTS.

Installing AWS CLI

Ubuntu 18.04 LTS comes with Python 3.6 preinstalled, and you can install the pip package manager to go with it by running the command below (if you would rather have an apt package for the CLI, read further below for a note on that):

$ sudo apt install python3-pip

If you are running Python 2, then replace python3-pip with python-pip. The AWS CLI is shipped as a pip package, so we will need pip. Once it is installed, use pip to install the CLI, as shown below.
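
A typical invocation looks like this (the --upgrade and --user flags are optional additions of mine):

$ pip3 install awscli --upgrade --user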

Once again, if you are using Python 2, replace pip3 with pip. If you want, you can use sudo apt install awscli to install the AWS CLI as well. You will be a couple of revisions behind, but that is fine. Once it is installed, relaunch the bash session.

Configuring the Environment

Assuming you don’t yet have your IAM access keys, you can either ask your organization’s AWS root user to create them for you, or, if you are using your own personal account and are your own root admin, open up the IAM Console in your browser.

Go to the “Users” tab and select the user account you want to use to access the CLI. Go to “Security Credentials” and create an access key and secret access key. Never share these keys with anyone, and make sure you don’t push them along with your git commits, etc.

Use these keys as the command below prompts you to enter their respective values:
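
The command in question is aws configure, which prompts for each of these values interactively:

$ aws configure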

Output:

AWS Access Key ID [None]: ADSLKFJAASDFKLJLGA
AWS Secret Access Key [None]: lkdsfh490IODSFOIsGFSD98+fdsfs/fs
Default region name [None]: us-west-2
Default output format [None]: json

The values for the access key and secret key will obviously be different in your case. When it comes to the region, choose the one that is closest to you (or to your users). For the output format, JSON is fine. Once you have entered valid information for all the values, your CLI is ready to interface with AWS remotely.

The ID and secret, as well as other config parameters, are stored in a subdirectory inside your home directory, ~/.aws. Make sure that it doesn’t get compromised. If it does get compromised, immediately revoke the ID and associated key using the IAM Console.
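
On a default setup, aws configure writes two small INI-style files there; inspecting them looks roughly like this (reusing the sample values from above):

$ ls ~/.aws
config  credentials
$ cat ~/.aws/credentials
[default]
aws_access_key_id = ADSLKFJAASDFKLJLGA
aws_secret_access_key = lkdsfh490IODSFOIsGFSD98+fdsfs/fs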

To use the CLI from different machines, you can always create more of these access keys.

Using the CLI

This is the part where you need to go through the man pages. Fortunately, the CLI is well-documented. Each service is its own command, and the various actions that you can perform using that particular service are listed under its own help section.

To illustrate this point better, let’s start with:
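
That starting point is the CLI’s own top-level help:

$ aws help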

If you scroll down in the output page, you will see all the services listed:

Output:

AVAILABLE SERVICES
o acm
o acm-pca
o alexaforbusiness
o apigateway
.
.
.
o dynamodb
o dynamodbstreams
o ec2
o ecr
o ecs
o efs
o eks

Now, let’s say you want to use the Amazon EC2 service to launch your EC2 instances. You explore further by going to:
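
That is the service-level help for ec2:

$ aws ec2 help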

This will get you all sorts of subcommands that you can use for creating snapshots, launching fleets of VMs, managing SSH keys, etc. Which of them your application demands is for you to decide. Of course, the list of commands, subcommands and valid arguments is in fact quite long, but you probably won’t have to use every option.
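
For instance, a read-only query such as describe-instances is a safe way to poke around (the filter value shown is only an illustration):

$ aws ec2 describe-instances --output table
$ aws ec2 describe-instances --filters "Name=instance-state-name,Values=running"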

Conclusion

If you are just starting out, I’d recommend beginning with the console for launching various instances and managing them. This will give you a pretty good idea of what options to look for when using the CLI. Eventually, as you use more and more of the CLI, you can start writing scripts to automate the entire resource creation, management and deletion process.

Don’t force yourself into learning about it. These things take time to sink in.

Source

Kodak’s new 3D printer has a Raspberry Pi inside

Kodak has launched a Raspberry Pi 3 based Kodak Portrait 3D Printer with a dual-extrusion system, multiple filament types, a 5-inch touchscreen, and WiFi and Ethernet connections to a Kodak 3D Cloud service.

Kodak and Smart Int’l. have collaborated on a professional, dual extrusion Kodak Portrait 3D Printer that runs a Linux-based 3DprinterOS on a Raspberry Pi 3 board. The $3,500 device offers connections to a Kodak 3D Cloud service, and is designed for engineering, design, and education professionals.

Kodak Portrait 3D Printer

Like the BeagleBone-based Autodesk Ember 3D printer, the Kodak Portrait 3D Printer is based on a popular Linux hacker board. In this case, it’s a Raspberry Pi 3 SBC running Kodak’s Linux-based 3DprinterOS print management software.

Other Raspberry Pi based 3D printers include the industrial-oriented, $4,500 and up AON 3D Printer. There are also a variety of Raspberry Pi 3D printer hacking projects available, many of which use the RPi-compatible OctoPrint software.

Kodak Portrait 3D Printer dual extrusion system (left) and interior view

The Kodak Portrait 3D Printer has a dual extrusion system with a 1.75mm filament diameter and automatic nozzle lifting. The extrusion system provides swappable PTFE and all-metal hotends “for optimal material compatibility,” says Kodak.

The printer provides a 0.4mm nozzle with 20-250 micron layer resolution and XYZ accuracy of 12.5, 12.5, 2.5 microns. It also offers 16mm XY motion and 12mm Z motion. A sensor warns you when your filament is almost gone.

Kodak Portrait 3D Printer, front and back

Materials include different grades of PLA, as well as ABS, Flex 98, HIPS, PETG, water soluble PVA, and two grades of Nylon. The Kodak-manufactured materials are available in a wide color palette, including Kodak’s Trade Dress Yellow. They are claimed to offer low-moisture packaging and high dimensional accuracy.

The 455 x 435 x 565mm printer has an all-steel structure allowing high-temperature builds, with support for build plate temperatures up to 105ºC and nozzle temperatures up to 295ºC. A fully-enclosed print chamber with a 200 x 200 x 235mm build volume features a HEPA and activated-carbon filter, thereby “reducing unwanted odors and keeping fingers away from hot moving parts,” says Kodak. Other features include magnetically attached print surfaces.

Kodak Portrait 3D Printer (left) and touchscreen

The Kodak Portrait 3D Printer is equipped with a 5-inch, 800 x 480 color touchscreen, as well as WiFi, Ethernet, a USB port, and a build chamber camera. Using 3DprinterOS, you can manage print settings such as automatic leveling and calibration, and you can preset print parameters for every material.

The Linux-based software provides free access to the Kodak 3D Cloud service, where you can manage a print farm of multiple machines from anywhere in the world. Users can “slice online, monitor their prints and receive over-the-air updates,” says Kodak.

Kodak Portrait 3D Printer video demo

Further information

The Kodak Portrait 3D Printer is available now for $3,499 in Europe and the U.S. More information may be found in Kodak’s announcement, as well as its product and shopping pages.

Source

Snaps are the new Linux Apps that work on every Distro

Ask anyone using any mainstream operating system, be it on PCs or mobile: their biggest gripe is apps. Finding useful and functional apps when using anything other than macOS, Windows, Android or iOS is a serious hassle. Those of us trying our feet in the murky Linux ecosystem are not spared.

For a long time, getting apps for your Linux computer was an exercise in futility. This issue was made even worse by just how fragmented the Linux ecosystem is. This drove most of us to more mainstream distros like Ubuntu and Linux Mint for their relatively active developer communities and support.


See, when using Linux, you couldn’t exactly Google the name of a program you want, download the .exe file, double-click it, and have it installed like you would on Windows (although technically you can now do something similar with .deb files). You had to know your way around the Terminal. Once in the Terminal, in the case of Ubuntu, you needed to add the software source to your repositories with sudo apt commands, then update the package cache, and finally install the app you want with sudo apt-get install. In most cases, the dependencies would be all messed up and you’d have to scroll through endless forums trying to figure out how to fix that one pesky dependency that just won’t allow your app to run well. A typical run of that traditional workflow is sketched below.
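
As a rough illustration of that traditional workflow (the PPA name here is purely hypothetical):

$ sudo add-apt-repository ppa:example/some-app   # hypothetical third-party PPA
$ sudo apt-get update                            # refresh the package cache
$ sudo apt-get install some-app                  # finally install the app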

You’d jump through all these hoops and then finally the app would run, but then it would look all weird because maybe it wasn’t made for your distro. Bottom line, it takes patience and resilience to install Linux Apps.

Snaps

Snaps are essentially applications compressed together with their dependencies and with descriptions of how to run and interact with other software on the system they are installed on. Snaps are secure in that they are designed to be sandboxed and isolated from other system software.

Snaps are easily installable, upgradeable, downgradeable, and removable irrespective of the underlying system. For this reason, they can be installed on basically any Linux-based system. Canonical is even using Snaps as the packaging medium for Ubuntu Core, its Ubuntu flavor for Internet of Things devices and large container deployments.

How to Install Snap in Linux

In this section, I will show you how to install Snap on Linux and how to use snap to install, update or remove packages. Ubuntu has shipped with Snap pre-installed since Ubuntu 16.04, so any Linux distro based on Ubuntu 16.04 or newer doesn’t need to install it again. For other distributions, you can follow the instructions shown below:

On Arch Linux

$ sudo yaourt -S snapd
$ sudo systemctl start snapd.socket

On Fedora

$ sudo dnf copr enable zyga/snapcore
$ sudo dnf install snapd
$ sudo systemctl enable --now snapd.service
$ sudo setenforce 0

Once snap has been installed and started, you can list all available packages in the snap store as shown.

$ snap find

To search for a particular package, just specify package name as shown.

$ snap find package-name

To install a snap package, specify the package by name.

$ sudo snap install package-name

To update an installed snap package, specify the package by name.

$ sudo snap refresh package-name

To remove an installed snap package, run.

$ sudo snap remove package-name
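
For a concrete example (vlc is just one well-known snap; any name returned by snap find works the same way):

$ sudo snap install vlc     # install the VLC media player snap
$ snap list                 # confirm it shows up among installed snaps
$ sudo snap remove vlc      # and remove it again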

To learn more about snap packages, go through Snapcraft’s official page or head on out to the Snap Store to explore the bunch of apps that are already available.

I feel like Snaps are growing into something like the Google Play Store: a central place where Linux users, irrespective of which fork of Linux they’re running, can come to get apps that just work, with little to no fuss at all. At the moment, there are thousands of snaps used by millions of people across 41 Linux distributions, and this number is only going to grow. If there’s ever been a good time to switch to Linux, it is now. The platform really has come of age.

Source

5 Easy Tips for Linux Web Browser Security | Linux.com

If you use your Linux desktop and never open a web browser, you are a special kind of user. For most of us, however, a web browser has become one of the most-used digital tools on the planet. We work, we play, we get news, we interact, we bank… the number of things we do via a web browser far exceeds what we do in local applications. Because of that, we need to be cognizant of how we work with web browsers, and do so with a nod to security. Why? Because there will always be nefarious sites and people attempting to steal information. Considering the sensitive nature of the information we send through our web browsers, it should be obvious why security is of utmost importance.

So, what is a user to do? In this article, I’ll offer a few basic tips, for users of all sorts, to help decrease the chances that your data will end up in the hands of the wrong people. I will be demonstrating on the Firefox web browser, but many of these tips cross the application threshold and can be applied to any flavor of web browser.

1. Choose Your Browser Wisely

Although most of these tips apply to most browsers, it is imperative that you select your web browser wisely. One of the more important aspects of browser security is the frequency of updates. New issues are discovered quite frequently and you need to have a web browser that is as up to date as possible. Of major browsers, here is how they rank with updates released in 2017:

  1. Chrome released 8 updates (with Chromium following up with numerous security patches throughout the year).
  2. Firefox released 7 updates.
  3. Edge released 2 updates.
  4. Safari released 1 update (although Apple does release 5-6 security patches yearly).

But even if your browser of choice releases an update every month, if you (as a user) don’t upgrade, that update does you no good. This can be problematic with certain Linux distributions. Although many of the more popular flavors of Linux do a good job of keeping web browsers up to date, others do not. So, it’s crucial that you manually keep on top of browser updates. This might mean your distribution of choice doesn’t include the latest version of your web browser of choice in its standard repository. If that’s the case, you can always manually download the latest version of the browser from the developer’s download page and install from there.

If you like to live on the edge, you can always use a beta or daily build version of your browser. Do note that using a daily build or beta version comes with the possibility of unstable software. Say, however, you’re okay with using a daily build of Firefox on an Ubuntu-based distribution. To do that, add the necessary repository with the command:

sudo apt-add-repository ppa:ubuntu-mozilla-daily/ppa

Update apt and install the daily Firefox with the commands:

sudo apt-get update

sudo apt-get install firefox

What’s most important here is to never allow your browser to get far out of date. You want to have the most updated version possible on your desktop. Period. If you fail this one thing, you could be using a browser that is vulnerable to numerous issues.

2. Use A Private Window

Now that you have your browser updated, how do you best make use of it? If you happen to be of the really concerned type, you should consider always using a private window. Why? Private browser windows don’t retain your data: No passwords, no cookies, no cache, no history… nothing. The one caveat to browsing through a private window is that (as you probably expect), every time you go back to a web site, or use a service, you’ll have to re-type any credentials to log in. If you’re serious about browser security, never saving credentials should be your default behavior.

This leads me to a reminder that everyone needs: Make your passwords strong! In fact, at this point in the game, everyone should be using a password manager to store very strong passwords. My password manager of choice is Universal Password Manager.

3. Protect Your Passwords

For some, having to retype those passwords every single time might be too much. So what do you do if you want to protect those passwords while not having to type them constantly? If you use Firefox, there’s a built-in tool called Master Password. With this enabled, none of your browser’s saved passwords are accessible until you correctly type the master password. To set this up, do the following:

  1. Open Firefox.
  2. Click the menu button.
  3. Click Preferences.
  4. In the Preferences window, click Privacy & Security.
  5. In the resulting window, click the checkbox for Use a master password (Figure 1).
  6. When prompted, type and verify your new master password (Figure 2).
  7. Close and reopen Firefox.

4. Know your Extensions

There are plenty of privacy-focused extensions available for most browsers. What extensions you use will depend upon what you want to focus on. For myself, I choose the following extensions for Firefox:

  • Firefox Multi-Account Containers – Allows you to configure certain sites to open in a containerized tab.
  • Facebook Container – Always opens Facebook in a containerized tab (Firefox Multi-Account Containers is required for this).
  • Avast Online Security – Identifies and blocks known phishing sites and displays a website’s security rating (curated by the Avast community of over 400 million users).
  • Mining Blocker – Blocks all CPU-Crypto Miners before they are loaded.
  • PassFF – Integrates with pass (A UNIX password manager) to store credentials safely.
  • Privacy Badger – Automatically learns to block trackers.
  • uBlock Origin – Blocks trackers based on known lists.

Of course, you’ll find plenty more security-focused extensions for other browsers as well.

Not every web browser offers extensions. Some, such as Midori, offer a limited number of built-in plugins that can be enabled/disabled (Figure 3). However, you won’t find third-party plugins available for the majority of these lightweight browsers.

5. Virtualize

For those who are concerned about exposing locally stored data to prying eyes, one option would be to only use a browser on a virtual machine. To do this, install the likes of VirtualBox, install a Linux guest, and then run whatever browser you like in the virtual environment (a minimal host-side sketch follows below). If you then apply the above tips, your browsing experience will be about as safe as it can reasonably get.
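
A minimal sketch of the host side, assuming an Ubuntu-based host (the Linux guest is then created and installed from within the VirtualBox interface):

sudo apt update

sudo apt install virtualbox

Running VBoxManage --version afterwards is a quick way to confirm the hypervisor installed correctly.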

The Truth of the Matter

The truth is, if the machine you are working from is on a network, you’re never going to be 100% safe. However, if you use that web browser intelligently you’ll get more bang out of your security buck and be less prone to having data stolen. The silver lining with Linux is that the chances of getting malicious software installed on your machine are exponentially lower than if you were using another platform. Just remember to always use the latest release of your browser, keep your operating system updated, and use caution with the sites you visit.

Source

How To Install Atom Text Editor on Ubuntu 18.04

Atom is an open source cross-platform code editor developed by GitHub. It has a built-in package manager, embedded Git control, smart autocompletion, syntax highlighting and multiple panes.

Under the hood Atom is a desktop application built on Electron using HTML, JavaScript, CSS, and Node.js.

The easiest and recommended way to install Atom on Ubuntu machines is to enable the Atom repository and install the Atom package through the command line.

Although this tutorial is written for Ubuntu 18.04 the same instructions apply for Ubuntu 16.04 and any Debian based distribution, including Debian, Linux Mint and Elementary OS.

Prerequisites

The user you are logging in as must have sudo privileges to be able to install packages.

Installing Atom on Ubuntu

Perform the following steps to install Atom on your Ubuntu system:

  1. Start by updating the packages list and install the dependencies by typing:

    sudo apt update
    sudo apt install software-properties-common apt-transport-https wget

  2. Next, import the Atom Editor GPG key using the following wget command:

    wget -q https://packagecloud.io/AtomEditor/atom/gpgkey -O- | sudo apt-key add -

    And enable the Atom repository by typing:

    sudo add-apt-repository "deb [arch=amd64] https://packagecloud.io/AtomEditor/atom/any/ any main"

  3. Once the repository is enabled, install the latest version of Atom with the command below (the package is simply named atom):
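
    sudo apt install atom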

Starting Atom

Now that Atom is installed on your Ubuntu system, you can launch it either from the command line by typing atom or by clicking on the Atom icon (Activities -> Atom).

When you start the Atom editor for the first time, a window like the following should appear:

You can now start installing themes and extensions and configuring Atom according to your preferences.

Upgrading Atom

To upgrade your Atom installation when new releases are published, you can use the apt package manager’s normal upgrade procedure:

sudo apt update
sudo apt upgrade

Conclusion

You have successfully installed Atom on your Ubuntu 18.04 machine. To learn more about how to use Atom, from beginner basics to advanced techniques, visit their official documentation page.

If you have any questions, please leave a comment below.

Source

Official Google Twitter account hacked in Bitcoin scam

Source: Naked Security/Sophos

The epidemic of Twitter-based Bitcoin scams took another twist this week as attackers tweeted scams directly from two verified high-profile accounts. Criminals sent posts from both Google’s G Suite account and Target’s official Twitter account. Cryptocurrency giveaway scams work by offering money to victims. There’s a catch, of course: they must first send a small amount of money to ‘verify their address’. The promised money never shows up, and the attackers cash out.

Source
