Pioneers in Open Source–Eren Niazi, Part I: the Start of a Movement and the Open-Source Revolution Redefining the Data Center

The name may not be familiar to everyone,
but Eren Niazi can be credited with
laying the foundation and paving the way for many of the software-defined and
cloud-centric technologies in use today.

When considering the modern data center, it’s difficult to imagine a time when
open-source technologies were considered taboo or not production-grade, but
that time actually existed. There was a time when the data center meant closed
and proprietary technologies,
developed and distributed by some of the biggest names in the industry—the days when EMC, NetApp, Hewlett Packard (HP), Oracle or even Sun
Microsystems owned your data center and the few applications upon which you
heavily relied.
It also was a time when your choice was limited to one vendor, and you would invest
big into that single vendor. If you were an HP shop, you bought HP. If you were
an EMC shop, you
bought EMC—and so on. From the customer’s point of
view, needing to interact with only a single vendor for purchasing, management
and support was comforting.

However, shifting focus back to the present, the landscape
is quite different. Instead, you’ll find an environment of mixed
offerings provided by an assortment of vendors, both large and small.
Proprietary machines work side by side with off-the-shelf commodity devices
running software-defined solutions, most of which are built on top of
open-source code. And half the applications are hosted in virtual machines
on a hypervisor or simply spun up in one or more containers.

These changes didn’t happen overnight. It took visionaries like Eren Niazi
to identify the full potential of open-source software technologies. He saw
what others did not and, in turn, proved to an entire industry that open
source was production-ready; he then used that same technology to
redefine the entire data center.

His story is a complicated one, filled with ups and downs: a journey that gave
him everything, only to have it all taken away. But, let’s begin at the
beginning.

Born in Sunnyvale, California, a little more than 40 years ago, Eren grew up down the
street from Steve Jobs, and on many occasions, he engaged the legendary
Apple co-founder in inspiring conversations. The two shared many
characteristics. Neither ever finished college. Both were
entrepreneurs and inventors. Niazi and Jobs each were driven from their own
companies, only to return again. Around age 12, Eren became
fascinated with computers and learned how to develop code. However, his
adventures in open-source technologies didn’t truly start until the year
1998.

Jim Truong took the young Niazi, a teenager with no college education, under
his wing over at AMAX Engineering, a server and cluster computing company.
Founded in 1979, AMAX Engineering Corporation designs and engineers customized
platforms for data centers. Today, it has expanded to provide solutions to
host cloud, big data and high-performance parallel computing workloads.

At age 19, Niazi was working diligently on architecting
supercomputers for large account customers, which included the federal
government and Linux Networx. By the end of his career with AMAX, Eren had
risen to the level of OEM group manager.

I was fortunate to be able to reach Jim for comment:

I met Eren back in 1999 when I hired him at AMAX Engineering. Even then, at 19,
he was extremely passionate about technology. He was self-taught and even
learned to write code on his own. Eren was very motivated and wanted to learn
everything. The question was never about how, but how fast. Once he set his
sights on a goal, Eren would be 110% committed.

Deep down, I always knew that he was going to be an entrepreneur. I just never
imagined that he would go on to accomplish so much in the open source space. At
the time, everyone else was treating open source software as a pet project and
configuring machines to run simple tasks out of their homes. Eren took that
same technology and proved it to be production grade. He used it to compete
with Enterprise level solution providers in the data storage space but at a
large fraction of the cost.

While Eren was at AMAX, he took notice of a trend in this sector’s technology
and the path it was heading down. That observation led to a
unique vision for open-source integration. The vision may not sound so unique
today, but at the time, it went against the norms just enough to be considered
revolutionary.

In 2001, Eren left AMAX and founded Open Source Storage, Incorporated, or OSS,
a company focused on pairing commodity off-the-shelf hardware with open-source
software and pushing the combination into the enterprise space. At the time,
“open-source” anything was still considered somewhat controversial, even more
so in the professional workplace. But, that did not stop or dissuade the young
Eren from pressing on.

Some might even say that Eren could be credited with coining the terms open source
<fill in the blank>. The same sentiment was both expressed
and validated by Jim Truong of AMAX:
“Eren worked hard to pave the way for the open source storage movement (a term
he coined), and he can probably be credited for getting us to where we are
today. Not many individuals can achieve what he did.”

And there probably is some truth to this. Eren Niazi continues to hold many
domain names, most of which were acquired 17 or more years ago. For example, a
whois on opensourcesystems.com dates back to 1999, while a whois on
opensourcestorage.com shows a creation date of 2001:

$ whois opensourcesystems.com | grep Creation | head -n1
Creation Date: 1999-01-01T05:00:00Z
$ whois opensourcestorage.com | grep Creation | head -n1
Creation Date: 2001-12-06T03:19:35Z

Niazi still owns those same domains.



Figure 1. A Few Domains Owned by
Eren Niazi

A trademark for Open Source Storage was filed on January 5, 2004 and registered
on June 21, 2005.


Figure 2. Serial Number: 78347754 and
Registration Number: 2963234

OSS hit the ground running. The company did the unthinkable: it married
open-source software with commodity off-the-shelf hardware and, in turn, sold
the combination as a cheaper alternative to both the big players and the
newcomers to the industry.

Friendster, one of the original social-networking sites, was one of the early
OSS customers. The social network needed both hardware and a scalable platform.
OSS was able to fill that void, and at a very competitive price. It wouldn’t
be long before Friendster employees left for the new kid on the block,
Facebook. Those former Friendster employees provided OSS with a wonderful
business opportunity. Facebook was growing, and it was growing fast. The year
was 2004. With its foot already in the door, OSS deployed its software
stack on top of 3500 systems and was with Facebook during its early growth
years—at least until 2007.

Note: Friendster is currently a social gaming site, but that wasn’t always the
case. Friendster was originally founded as a social networking site in 2002.
The relaunch as a social gaming platform occurred much later, in 2011.



Figure 3. Niazi Standing in Front of
Server Cabinets in the Early Facebook Data Center

Stories circulated claiming that Mark Zuckerberg had invited Eren
Niazi to accompany him to the Nasdaq on the day he rang the opening bell to
mark Facebook’s IPO. While Niazi and Zuckerberg were very close, this story
was nothing more than a simple rumor. Regardless, Eren did take
advantage of the opportunity by purchasing pre-sale Facebook IPO stock via a
registered stock broker.

Open Source Storage had done what few thought possible: it had commercialized
open-source software. Open source was ready for the enterprise. Taking notice,
the industry shifted toward it. By the year 2007, the company’s list of
customers had grown to include the following:

  • Friendster
  • Facebook
  • NASA
  • Shutterfly
  • FriendFinder
  • Yahoo
  • eBay
  • Shopping.com (later acquired by eBay)
  • USGS
  • Lockheed Martin
  • US Army
  • And more…

When reached for comment, the former OSS warehouse manager Marty Wallach
validated the above list of customers. In his brief, almost two-year tenure
with the company, Marty wore many hats. His main responsibilities revolved
around inventory, logistics and vendor and client orders. He secured the
components and hardware before they were assembled and shipped to customers
like Facebook. He also made many trips to the old Facebook offices located on
University Avenue, and even to Shutterfly.

With regard to his time spent working with Eren, he said: “I have known Eren a
long time, and he has always been up to date with the technology. His
background has always been impressive, and he has tremendous drive.”

Although I’ve gone on about the high-profile customers OSS accumulated
through the years, I’d like to take a step back and look at the actual product. By
today’s standards, it isn’t anything new. Today, people use the term
“software-defined” to label what OSS had done a decade earlier.

Software-defined solutions were not a thing in those days, and yet, that is
exactly what OSS was building and selling. The software was a CentOS Linux
respin: a Kickstart server would load the predefined operating-system image
and the minimum set of packages required.
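The Kickstart approach can be sketched with a minimal ks.cfg file. The mirror URL, password hash and package list below are illustrative assumptions, not OSS’s actual configuration:

```
# ks.cfg -- unattended install of a minimal CentOS image (illustrative sketch)
install
url --url http://mirror.example.com/centos/os/i386/
lang en_US.UTF-8
keyboard us
rootpw --iscrypted $1$replace$withyourownhash
clearpart --all --initlabel
autopart
bootloader --location=mbr
reboot

# Only the minimum set of packages required.
%packages --nobase
@core
openssh-server
```

Pointing the installer at a file like this (for example, via a ks= boot parameter) answers every installation prompt automatically, so hundreds of identical systems can be imaged without human interaction.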

Note: software-defined solutions couple special-purpose software with
commodity off-the-shelf hardware. The term was coined around 2009 (maybe a
little later), and the approach remains a hot and trending technology in
today’s data center.

Initially, Open Source Storage was building its own hardware (using
off-the-shelf components), all based on the open standards of the time. This
provided real advantages. For example, the high-efficiency power supplies
generated 50% less heat and consumed significantly less energy (between 30%
and 50%). To enable the hardware that OSS provided to early
customers, the motherboard’s BIOS needed to be rewritten, and the company
worked closely with both Intel and AMD to accomplish this. In fact, the first
OSS office was located across the street from Intel in Santa Clara,
California.

The internet exploded with a plethora of services, applications and
entertainment platforms. Data centers were only getting bigger, with a lot
more hardware. There was a constant need to reduce heat and, in turn, save on
cooling costs. The Gemini 2U was one of the greener offerings of its time.


Figure 4. Open Source Storage
Recognition and Awards from the Early Years

Labeled the Gemini 2U, this flagship system fitted dual motherboards and
other fixtures into a single enclosure. A patent was filed in 2006 and
published in 2008 (US 20080037214):

According to this embodiment, the chassis features a chassis base, first and
second bays for first and second motherboards, a fan assembly for mounting
fans, a backplane for I/O connections mounted to the chassis base, and at least
two compartments for electronic components. The first bay and the second bay
are laterally adjacent so that, when in use, the first and second motherboards
are in substantially the same plane.



Figure 5. The Gemini 2U’s
Patent Design Number US 20080037214

Note: a U is a unit of measurement designating the height of a computer
enclosure in a data-center rack cabinet. A single U measures 1.75 inches in
height; a 2U enclosure, therefore, is 3.5 inches tall. In that time frame, a
single 2U enclosure was capable of holding up to 12 3.5″ spinning hard disk
drives (HDDs).
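The conversion is simple arithmetic, and it scripts easily. Here's a quick shell sketch (the helper name is mine, not a standard tool):

```shell
# Convert rack units (U) to inches: one U is 1.75 inches.
u_to_inches() { awk -v n="$1" 'BEGIN { printf "%.2f\n", n * 1.75 }'; }

u_to_inches 1    # 1.75
u_to_inches 2    # 3.50 -- the height of a Gemini 2U
u_to_inches 42   # 73.50 -- a common full-height rack
```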

It wouldn’t take long for companies like HP and Supermicro to copy this
unique twin-server design, but because OSS did not have the money to litigate,
those hardware vendors continued to sell the design in their respective
product lines. In the case of HP, the design was first introduced in its
ProLiant series.


Figure 6. An Advertisement for the Gemini
2U

As OSS’s operations grew, the need for a larger facility became
increasingly important. In 2004, the company moved its headquarters to the
former 33,000-square-foot Atari facility located at 1195 Borregas Avenue in
Sunnyvale, California. The company continued to operate from that location until
2007.


Figure 7. OSS Headquarters in
Sunnyvale, California (2004–2007)

Business was booming, and OSS was seeing an annual run rate of $40 million.
Not bad, when you consider that the entire business was built with credit
cards and a minimal amount of money to bootstrap itself. Eren and OSS were
turning heads, and an entire industry took notice.



Figure 8. Open Source Storage
Featured in Silicon Valley Business Journal



Figure 9. Open Source Storage
Featured in Custom Systems Magazine

The company did very well until 2007. It grew so rapidly that it needed
additional capital from investors, and then the recession hit. Once it did,
the investors wanted Niazi to sell, but he wouldn’t budge. As a result, those
same investors pulled out of the company.

Here’s Eren Niazi on this topic:

It was never about the money. Over the years, I was given many opportunities to
sell OSS and refused to sell the business to Oracle, HP or IBM.
It was never a business. It was a movement. To
this day, I would take a bullet for the company.

With little to no capital left, Open Source Storage filed for bankruptcy. And
although OSS was going through its own financial crisis, it did not impact the
entire business—meaning, OSS continued to maintain its clients, but a
strategic move was required.

By 2007, and following the bankruptcy, the business model needed to change,
and it did, shifting its focus to enterprise-grade turnkey open-source
software solutions intended for public, private and hybrid cloud deployments.
To put this in perspective, it was in 2006 that Amazon’s subsidiary, Amazon
Web Services (AWS), first introduced its Elastic Compute Cloud (EC2).

The decision was made to focus only on the software and not the hardware,
with an additional emphasis on Agile development. In fact, the industry
already was starting to trend toward this model. Niazi and his team looked
beyond the complete operating-system model to develop more of the middleware
needed by their customers, a majority of which consisted of migration tools to
ease the transition from proprietary platforms to their open-source
counterparts. For instance, why continue spending big dollars with Oracle and
Sun Microsystems when you could cut your costs by 80% and instead host that
same data with MySQL on top of FreeNAS? Customers enjoyed the idea of getting
away from these data-center monopolies. Needless to say, this eventually
created tension between OSS and Oracle.

In parallel, the new customers being catered to under this model were
startups, about 75 of them, to be exact. OSS was contracted to build “apps”
for them. The process began with soft coding and prototyping to fill the
customer’s initial requirements, and once the startup was fully funded,
OSS would then build the hardened application. In-house, more than 200
developers (contractors) were commissioned to handle the bulk of this work. It
was a relatively large operation.

One satisfied OSS customer (circa 2014), whom I’ll refer to as William G.,
provided the following testimonial:

We were introduced to Eren through a mutual friend and shortly thereafter flew
out to California to meet the team. Our company was building an interactive
Music Trading Card platform. Open Source Storage accomplished exactly what we
needed, and we were very happy with them. They built an open source platform
that scaled and within the agreed upon time frame.

It would take a creative genius to see the true potential in open-source
software and prove to an entire industry that it was production-grade and
fully capable of hosting consumer workloads. This piece of history was only the
beginning. A prosperous Niazi begins to
buckle under the pressure, the effects of which impact OSS and the very
movement he began more than a decade earlier. The rest of his turbulent
story will unfold in Part II.

The revolution in the data center had taken place, and the foundation
was laid for what was about to come. Stay tuned.

Source

Celebrating 15 Years of the Xen Project and Our Future | Linux.com

In the 1990s, Xen was a part of a research project to build a public computing infrastructure on the Internet led by Ian Pratt and Keir Fraser at The University of Cambridge Computer Laboratory. The Xen Project is now one of the most popular open source hypervisors and amasses more than 10 million users, and this October marks our 15th anniversary.

From its beginnings, Xen technology focused on building a modular and flexible architecture, a high degree of customizability, and security. This security mindset from the outset led to inclusion of non-core security technologies, which eventually allowed the Xen Project to excel outside of the data center and be a trusted source for security and embedded vendors (ex. Qubes, Bromium, Bitdefender, Star Labs, Zentific, Dornerworks, Bosch, BAE systems), and also a leading hypervisor contender for the automotive space.

As the Xen Project looks to a future of virtualization everywhere, we reflect back on some of our major achievements over the last 15 years. To celebrate, we’ve created an infographic that captures some of our key milestones. Share it on social media.

Read more at The Linux Foundation

Source

Linux Today – Oracle Moves to Gen 2 Cloud, Promising More Automation and Security

Oct 23, 2018, 09:00

(Other stories by Sean Michael Kerner)

Larry Ellison doesn’t care much for Amazon Web Services (AWS) as a cloud competitor. In his keynote address at Oracle OpenWorld on Oct. 22, Ellison outlined the new Gen 2 Cloud platform, making constant competitive comparisons against AWS.

A primary message from Ellison is that the Gen 2 Oracle cloud is more secure, with autonomous capabilities to help protect against attacks. Ellison also emphasized the segmentation and isolation of workloads on the Gen 2 Oracle cloud, providing improved security. The ability to easily move applications from the Gen 1 Oracle cloud to Gen 2, and from other clouds to Oracle, is a cornerstone of the Gen 2 Oracle cloud, Ellison said.

Complete Story

Source

Is a VPN a Necessity for Linux Users? – ThisHosting.Rocks

Let’s delve into what a VPN is and who needs one before exploring if a VPN is really necessary for Linux users.

If you want a short answer telling you if a VPN, such as Surfshark, is a necessity for Linux users – the answer is maybe. This depends on the network you are connecting to, what you will be doing online, and how important privacy is to you. We are going to help you answer these questions for yourself to determine if a VPN is a necessity.

What is a VPN?

In the simplest terms, a VPN (Virtual Private Network) is a private connection to the internet. This privacy is established by routing your internet traffic through another computer with a secure connection. Anyone watching this traffic will simply know that your computer is communicating with one other computer on the network. This keeps them from intercepting information about the websites and services you are using online.

Your system connects to the VPN service which then connects to the other services you are using online. All your internet traffic is passed through the VPN service in order to protect your anonymity on the internet.

You can either buy a VPN service through a provider like Surfshark or you can self-host your own VPN on a cloud server.
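If you choose the self-hosted route, one modern option is WireGuard. The fragment below is a minimal client-side sketch only; every key, address and endpoint is a placeholder, not a value tied to any particular provider:

```
# /etc/wireguard/wg0.conf -- client side (all values are placeholders)
[Interface]
PrivateKey = <client-private-key>
Address = 10.0.0.2/32
DNS = 1.1.1.1

[Peer]
PublicKey = <server-public-key>
Endpoint = vpn.example.com:51820
AllowedIPs = 0.0.0.0/0    # route all traffic through the tunnel
PersistentKeepalive = 25
```

With a matching configuration on the cloud server, bringing the tunnel up with wg-quick up wg0 routes all of your traffic through that server.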

Who needs a VPN?

Even if the connection is secure and the information being sent cannot be seen, the router you connect to can see what site you access, when, and for how long. If you are working with confidential information or trade secrets, that can be very valuable information. This metadata is worth protecting and only sharing with a trusted service.

Even if you are not doing anything that needs to be kept secret, most people prefer to have their online activity remain private. For this reason alone, it is a good idea to use a VPN.

Digital Nomads & Road Warriors

The life of a digital nomad, or a road warrior, involves accessing the internet to get work done from different networks every day. Sometimes you may be on multiple different networks in a single day. These are operated by unknown parties which may, or may not, be trustworthy. This is why many digital nomads and road warriors travel with a VPN.

Work From Home Professionals

Just because you work from home does not mean you always work at home. With the popularity of coworking spaces around the world, and the people we all see working in coffee shops, working from home often does not mean you are working at home. Obviously, you may want a little privacy when you are using the public WiFi at your local coffee shop.

Public WiFi

Not all public internet hotspots are found in coffee shops. Many businesses provide free internet access today, and some cities do as well. Each of these offers an opportunity to hop online and get some work done. However, without a VPN, it is possible for these services to see where you are going online, and you may not want them to have that information.

Residential ISP

Do you want your internet service provider to know what you do online all day?

Many jobs that can be done online require access to trade secrets and confidential information. Rather than letting your home internet service provider know what you do online, you can route that traffic through a VPN. This way they can only see that you are communicating with your VPN service, but don’t see what you are accessing on the other end.

Isn’t Linux more secure than Windows?

The security we are talking about with a VPN has little to do with the operating system in use. Windows and Linux both send and receive packets of data on the internet in the same way. Part of this communication involves telling other systems where the packet needs to go.

Those devices at the endpoint for this communication, the router, in this case, can collect a lot of data about where you go online, when, and for how long. It does not matter if your device is using Windows, Linux, or Mac OS to navigate the web, the packets are the same.

Do Linux users really need a VPN?

As you can see, it all depends on the network you are connecting to, what you will be doing online, and how important privacy is to you.

If you are connecting to a trusted network then you can probably operate without a VPN. However, if you don’t trust the network or don’t have enough information to know if you can trust the network, then you will want to use a VPN. As an example, do you know who has access to the information collected by the open WiFi service at your local coffee shop? Would you want them to know where you go online, when, and how long you use that service? If not, then a VPN can help secure that information while you’re using their network.

The question of what you will be doing online is just as important as the trust you place in the network. For example, there are business and personal finance tasks which you would not want to be intercepted. However, most people would not be too concerned about someone having information that shows they checked the weather forecast. What you are doing on the network can determine if a VPN is a necessity for Linux users.

Today, there are some people who have given up on the entire concept of privacy. For them, no VPN may be the way to go. However, the rest of us who value privacy should consider using a VPN just to reduce the amount of information about our online activities being shared. Using a VPN keeps your internet provider from seeing what you are doing online, and that privacy alone can be worth the cost.

About the Author

This article was submitted to us by a third-party writer. The views and opinions expressed in this article are those of the author and do not reflect the views and opinions of ThisHosting.Rocks. If you want to write for ThisHosting.Rocks, go here.


Source

Piwik Analytics on Nginx – LinuxAdmin.io

Piwik is a free and open-source website analytics application. It can be used to track Nginx requests as well as Apache. This guide covers the Nginx configuration and installation of Piwik. You can read more about Piwik here.

If you do not already have Nginx and PHP-FPM installed, please see the following guides

How To Install Nginx

How To Install PHP-FPM

Install Piwik

Create a new directory to contain the Piwik installation

mkdir /etc/nginx/stats.domain.com

Go to that directory

cd /etc/nginx/stats.domain.com

Download the latest version of Piwik

wget https://builds.piwik.org/piwik.zip

Un-compress it

unzip piwik.zip

Create a new database:

mysql -e "create database piwik;"

Create a new database user:

mysql -e "grant all on piwik.* to 'piwik'@'localhost' identified by 'PASSWORD';"

Configure Nginx

Create a new server configuration for stats.domain.com

nano /etc/nginx/stats.domain.com.conf

Insert the following, updating references to stats.domain.com with your domain name.

server {
    listen 80;
    server_name stats.domain.com;

    access_log /etc/nginx/logs/stats.domain.com_access.log;
    error_log /etc/nginx/logs/stats.domain.com_error.log;

    # Disable all methods besides HEAD, GET and POST.
    if ($request_method !~ ^(GET|HEAD|POST)$ ) {
        return 444;
    }

    root /etc/nginx/stats.domain.com/;
    index index.php index.html;

    # Disallow any usage of Piwik assets if the referer is not valid.
    location ~* ^.+\.(?:jpg|png|css|gif|jpeg|js|swf)$ {
        # Define the valid referers.
        valid_referers none blocked *.domain.com;
        if ($invalid_referer) {
            return 444;
        }
        expires max;
        break;
    }

    # Support for favicon. Return a 204 (No Content) if the favicon
    # doesn't exist.
    location = /favicon.ico {
        try_files /favicon.ico =204;
    }

    # Try all locations and relay to index.php as a fallback.
    location / {
        try_files $uri /index.php;
    }

    # Pass PHP requests through to PHP-FPM.
    location ~ \.php$ {
        try_files $uri =404;
        fastcgi_pass 127.0.0.1:9000;
        fastcgi_index index.php;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        include fastcgi_params;
    }

    # Return a 404 for all text files.
    location ~* ^/(?:README|LICENSE[^.]*|LEGALNOTICE)(?:\.txt)*$ {
        return 404;
    }
} # server

Edit the main nginx configuration file:

nano /etc/nginx/nginx.conf

Insert the following line inside the http { } block:

include /etc/nginx/stats.domain.com.conf;

Test the configuration, then restart Nginx

nginx -t && service nginx restart

Once you have done that, go ahead and visit stats.domain.com. It will ask you to input the MySQL credentials you created earlier. After it has populated the database, it will provide you with code to insert into the site you wish to track. Once that is in place, it will start generating data for you to view inside the Piwik installation.

Jun 5, 2017, LinuxAdmin.io

Source

We’re Dell Official, Y’all! – SUSE Communities

Share with friends and colleagues on social media

SUSE Linux Enterprise 15 is a major milestone for SUSE, our 1st major OS release in 5 years. With this release, customers can bridge traditional infrastructure to software defined infrastructure with key features like Modular+ architecture, unified installer, and HPC. This modern operating system will help simplify multimodal IT, make traditional IT infrastructure more efficient and provide an engaging platform for developers. As a result, organizations can easily deploy and transition business-critical workloads across on-premise and public cloud environments.

Dell’s two-decade partnership with SUSE has allowed multiple levels of engineering engagement throughout the testing and development lifecycle of this operating system. And just recently, following on the heels of the GA announcement for SUSE Linux Enterprise 15, Dell has completed the certification tests for this OS version on Dell’s latest server platforms.

This certification process includes a full battery of tests: hardware compatibility tests, testing of upgrade and migration scenarios, and compatibility testing with OpenManage™, the Dell systems-management software. By working together with SUSE on this certification process, Dell ensures that our shared customers have the best possible experience with the solution stack they have chosen through Dell and SUSE.

Dell Linux Product Manager Gordon Bookless had this to say about this latest major achievement in our partnership: “Real innovation demands IT transformation, and that often starts with modernizing the infrastructure. With the release of SUSE Linux Enterprise Server 15, organizations will be able to jumpstart their modernization and transformation projects with an enterprise ready, production grade Linux platform to drive innovation and reap the rewards of open source. Through our long collaboration, Dell and SUSE have integrated SUSE Linux Enterprise Server into our ready solutions for SAP and other workloads, offering our customers open and scalable solutions for all their mission critical compute needs.”

For more information about SUSE Linux Enterprise, check out these links:

For more information on Dell’s support for SUSE OS, see:

https://www.dell.com/support/contents/us/en/04/article/product-support/self-support-knowledgebase/operating-systems/linux-operating-systems/suse-linux

Share with friends and colleagues on social media

Source

Forty-Four New Organizations Join The Linux Foundation in September, Continuing Trend of More Than a Member a Day on Average in 2018

SAN FRANCISCO – October 23, 2018 – The Linux Foundation, the nonprofit organization enabling mass innovation through open source, announced the addition of 33 Silver members and 11 Associate members in the month of September. Linux Foundation members help support development of the shared technology resources, while accelerating their own innovation through open source leadership and participation. Linux Foundation member contributions help provide the infrastructure and resources that enable the world’s largest open collaboration communities.

Since the start of 2018, a new organization has joined The Linux Foundation every day and we are honored to be their partners in open source.

“We are thrilled to welcome forty-four new members to The Linux Foundation this month,” said Jim Zemlin, executive director, The Linux Foundation. “These organizations, which represent industries including technology, education, energy, the service industry and more, are working to create a more collaborative community in order to promote further innovation; we look forward to working with them to help make that happen.”

In addition to joining the Foundation, many of the new members have joined Linux Foundation projects such as the Cloud Native Computing Foundation, Hyperledger and LF Networking. For a full list of members, visit https://www.linuxfoundation.org/membership/members/.

Linux Foundation Silver members are organizations that contribute to or otherwise support open source communities and projects. The new Linux Foundation Silver members who joined in the month of September are:

  • Ambedded Technology specializes in professional ARM microserver solutions and the integration of the ARM platform into different vertical markets.
  • BetaBlocks transforms blockchain ideas into successful companies.
  • Blockchain Educators LLC is developing and cultivating concepts, partnerships, and businesses backed by blockchain technology.
  • Cardstack Syndicate Inc is an open-source framework and consensus protocol that makes blockchains usable and scalable for the mass market.
  • CenCon Blockchain Group Inc. is the first group company registered and approved by the Canadian government whose core business is improving production relations through blockchain technologies.
  • CloudYuga provides training in Docker, Kubernetes, Mesos Marathon, Container Security, the Go language, Advanced Linux Administration, and more.
  • Constellation Labs addresses the limitations inherent in current blockchain technology.
  • Dotscience snapshots every detail of every run of a model and enables you to visualize model behavior and optimize performance.
  • ENC Digital Technology Co., Ltd. provides marine tourism transportation and travel services in China.
  • EOS SOFTWARE, S.A. DE C.V. provides software to manage your EOS tools.
  • The Foundry Visionmongers Limited designs creative software technologies used to deliver award-winning visual effects and 3D content for the design, visualization and entertainment industries.
  • Garden.io’s platform is designed from the ground up to make developing, managing and testing multiple services dramatically faster and easier.
  • Honeywell International invents and manufactures technologies that address some of the world’s most critical challenges around energy, safety, productivity and global urbanization.
  • Hortonworks, Inc. delivers 100 percent open-source global data management platforms and services so customers can manage the full lifecycle of their data.
  • Infinidat focuses on eliminating the compromises between performance, availability, and cost at multi-petabyte scale for enterprise storage.
  • Intelligent Systems Services designs and installs fire alarm and life safety systems for a wide range of customers.
  • KoreConX investor relations sets the standard for transparency, compliance and investor confidence.
  • Mobilise offers consultancy services to MVNOs and others looking to enter the telecoms industry, including strategy, business casing, feasibility study, project management, solution architecture and service operations.
  • MSys Technologies engineers the entire product development cycle, from POCs to development to testing and support.
  • New Relic allows you to easily view and analyze massive amounts of data, and gain actionable insights in real time.
  • Noris Network AG delivers individual and high-quality single-source IT services to you.
  • Omnigate offers advanced transaction management tools to a broader customer base and allows more participants to join decentralized trading networks.
  • Open Cloud Foundation brings together diverse stakeholders to promote and build a more open, strong, secure and standardised cloud.
  • Pragma is a channel focused distribution company designed to support resellers and Ericsson-LG in bringing iPECS unified communications technology to the UK market.
  • Rift.IO automates the complex processes required to design, deploy, and scale virtualized network functions and services.
  • Rookout technology decouples the data visibility layer from the app, so you can review parts of the live code on demand.
  • SoftIron provides software-defined storage for high-performance applications, enabling enterprise-level IT infrastructure for business demands.
  • Solo.io helps enterprises migrate and gradually transform legacy applications to new architectures.
  • State Farm insurance helps people manage the risks of everyday life and recover from the unexpected.
  • Teuto.net Netzdienste GmbH specializes in providing hosting, cloud and web development services based on open source technologies.
  • Wallarm automates real-time application protection and security testing for websites, microservices and APIs across public and private clouds.
  • Wanchain is the world’s first and only interoperable blockchain with secure multi-party computing.
  • XSKY adds enterprise-ready interfaces and 24/7 maintenance capability to generic Ceph, helping customers reduce total cost of ownership and solve the dilemma of data expansion versus budget restriction.

Associate members of The Linux Foundation include government agencies and not-for-profit organizations that have demonstrated a commitment to building, sustaining, and using open source technologies. The following organizations are new Linux Foundation Associate members:

  • Blender Foundation is an independent nonprofit public benefit corporation that establishes services for active users and developers of Blender.
  • CERTH (Centre for Research and Technology Hellas) is one of the leading research centers in Greece with important scientific and technological achievements.
  • Enterprise Ethereum Alliance is the industry’s first global standards organization to deliver an open, standards-based architecture and specification to accelerate the adoption of Enterprise Ethereum.
  • FIWARE is a curated framework of open source platform components to accelerate the development of smart solutions.
  • Government of Bermuda
  • Ministry of Citizens Services provides a wide range of services to British Columbians across the province.
  • Monash University is one of Australia’s leading universities and helps change lives through research and education.
  • SARAO (South African Radio Astronomy Observatory) spearheads South Africa’s activities in Square Kilometre Array Radio Telescope in engineering, science and construction.
  • University of Kassel is a vibrant university characterized by its openness to new ideas in every single area of its work.
  • Visual Effects Society is a global professional honorary society representing the full breadth of visual effects practitioners including (but not limited to) artists, technologists, model makers, educators, and producers.
  • Washington State University has inspired the next generation of problem solvers since 1890.

With the support of its members, The Linux Foundation hosts open source projects across technologies including networking, security, cloud, blockchain, and more. This collaborative development model is helping technology advance at a rapid pace in a way that benefits individuals and organizations around the world.

Note – The Linux Foundation releases a look back list of new members joining the organization every month.

About The Linux Foundation

The Linux Foundation is the organization of choice for the world’s top developers and companies to build ecosystems that accelerate open technology development and industry adoption. Together with the worldwide open source community, it is solving the hardest technology problems by creating the largest shared technology investment in history. Founded in 2000, The Linux Foundation today provides tools, training and events to scale any open source project, which together deliver an economic impact not achievable by any one company. More information can be found at www.linuxfoundation.org.

# # #

The Linux Foundation has registered trademarks and uses trademarks. For a list of trademarks of The Linux Foundation, please see our trademark usage page: https://www.linuxfoundation.org/trademark-usage


Want To Run Linux On Android Without Rooting? Using UserLAnd

Just recently, I came across a new app on the Google Play Store that can help you run Linux on your existing Android smartphone. Named UserLAnd, the application is fully open source, and its code is available on GitHub.

The latest 1.0.0 version of the app follows the last beta release, 0.5.3, which was under development for the past few months. So let's look at what the free UserLAnd app has to offer.

As you may know, Android is based on a modified Linux kernel, so it makes sense that you can use it to run Linux commands and tools like ssh. UserLAnd makes these things easier and lets you run Linux distros like Debian and Ubuntu.

[Image: Lubuntu running in UserLAnd]

The major highlight of this app is that it doesn't demand root access. That's a big relief, as rooting exposes a device to numerous security flaws and can void its warranty. You can install and uninstall UserLAnd like any other regular application.

To use the app, you can either run single-click apps or set up user-defined custom sessions. The second method involves defining the filesystem and services (VNC or SSH) you wish to use. After this, the app downloads the necessary files, sets up everything, and connects to the server.

You can go ahead and visit the project's website and GitHub page to learn more about it and how to use it.

Did you find UserLAnd interesting? Share your views and keep reading Fossbytes.



Tracktion’s T7 DAW is Now Free to Download on Linux

Last updated September 9, 2018

There are several good Digital Audio Workstations (DAWs) available for Linux. However, only a few of them are free to download.

Now, Tracktion's T7 DAW has become freeware (though it is not open-source software), available to download for free across multiple platforms (Linux, Windows, and Mac).

If you've kept tabs on what Tracktion's up to, you may know that its previous version (T6) was already available for free while T7, with all the features of a full-fledged DAW, was in development. Now, they have decided to make T7 free as well.

With T7 joining the free DAWs available for Linux, some of the most popular paid DAWs, like Reaper or Bitwig, might respond with free Linux versions of their own to compete. But that is pure speculation at this point.

T7 DAW Features

While there are a lot of features to talk about in a DAW, let us take a quick look at what the T7 DAW has to offer.

Overview Of Features:

  • Cross-platform support (Mac, Windows, and Linux)
  • VST/AU/Linux VST plugin support
  • Unlimited Audio/MIDI tracks
  • Automation Tools
  • Video Sync
  • Latency Management
  • Step Sequencer
  • Warp Time
  • Clip Layer Effects
  • LFO Generators
  • Freeze Point Technology
  • Pitch Fades

Have a look at the features of Tracktion T7 in this video:

If the feature set sounds impressive, you should check out the system requirements before downloading. As per the official site, T7 has been tested on Ubuntu 16.04, and a Core i5 processor coupled with a minimum of 4 GB of RAM is recommended.

No matter what project you are working on, the T7 DAW will definitely prove to be a useful Digital Audio Workstation without requiring you to spend any money.

Of course, some features (like Melodyne Essential) that ship with Tracktion's Waveform 9, a paid DAW, will be unavailable.

Unless you’re an advanced music creator (or sound designer), you can do almost anything with Tracktion’s free T7 DAW.

You can get T7 DAW from the link below by registering an account with them.

What do you think about Tracktion’s T7 DAW? Is it good enough for you? Do you think that the paid ones might end up offering a free version for Linux as well?

Let us know your thoughts in the comments below.

About Ankush Das

A passionate technophile who also happens to be a Computer Science graduate. He has had bylines at a variety of publications that include Ubergizmo & Tech Cocktail. You will usually see cats dancing to the beautiful tunes sung by him.


Nginx Browser Caching – LinuxAdmin.io

You can use Nginx to set cache expiration times for specific file types and thereby leverage browser caching. The browser will then reuse its downloaded copy of a file until the date in the Expires header passes, which makes page loads faster on each subsequent request by the end user.
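To see what these headers mean in practice, the freshness lifetime is simply the Expires timestamp minus the Date timestamp of the response. A minimal sketch using GNU date, with two sample header values one week apart:

```shell
# Sample Date and Expires values as they would appear in response headers
served='Thu, 08 Jun 2017 01:26:19 GMT'
expires='Thu, 15 Jun 2017 01:26:19 GMT'

# Convert both to epoch seconds and subtract; the result is how long
# (in seconds) the browser may reuse its cached copy without re-checking.
echo $(( $(date -ud "$expires" +%s) - $(date -ud "$served" +%s) ))
```

Here the output is 604800 seconds, i.e. seven days.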

Prerequisites:

You will need to have Nginx already installed. If you do not, please see How To Install Nginx From Source On CentOS.

How To Check Current Nginx Browser Cache Behavior

Check the headers on an existing image:

$ curl -Is http://domain.com/test.png

HTTP/1.1 200 OK
Server: nginx/1.11.13
Date: Thu, 08 Jun 2017 01:26:19 GMT
Content-Type: image/png
Content-Length: 6983
Last-Modified: Fri, 28 Apr 2017 02:21:20 GMT
Connection: keep-alive
ETag: "5902a720-1b47"
Accept-Ranges: bytes

If you look at the response, it is missing an Expires header. This header needs to be in place to tell the browser not to check the file again until the cache time has expired.

Nginx Browser Cache Configuration

To implement this, you will need to edit your Nginx server configuration:

nano /etc/nginx/nginx.conf

Inside the server {} block, you will want to place the file types you would like to cache and their respective times:

location ~* \.(png|jpg|jpeg|gif|ico|otf)$ {
    expires 1y;
}
location ~* \.(js|css)$ {
    expires 7d;
}

The first block caches the file types .png, .jpg, .jpeg, .gif, .ico, and .otf for a period of one year. The second location caches .js and .css files for a period of seven days. You can add other file types or adjust the times as you see fit based on the needs of your site. Once that has been completed, go ahead and save the file.
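As a quick sanity check, the expires directive values map directly to the max-age a client will see in the Cache-Control header (plain shell arithmetic below, not nginx itself; nginx counts a year as 365 days):

```shell
# "expires 1y" -> Cache-Control: max-age=31536000
echo $((365 * 24 * 3600))

# "expires 7d" -> Cache-Control: max-age=604800
echo $((7 * 24 * 3600))
```

These are the max-age values you should see when you re-check the headers after reloading the configuration.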

Once that is done, restart Nginx so the new configuration takes effect (running nginx -t first will validate the syntax):

service nginx restart

You can then go ahead and check the same image as you did before adding the headers:

$ curl -Is http://domain.com/test.png

HTTP/1.1 200 OK
Date: Thu, 08 Jun 2017 01:25:57 GMT
Content-Type: image/png
Content-Length: 6983
Connection: keep-alive
Last-Modified: Fri, 28 Apr 2017 02:21:20 GMT
ETag: "5902a720-1b47"
Expires: Fri, 25 May 2018 23:17:30 GMT
Cache-Control: max-age=31536000

You now see the Expires header, and the Cache-Control max-age corresponds with the expiry time you set for that file type (31536000 seconds is one year). This improves page speed for end users who make repeated requests, and it also reduces overall server load per page request.
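If you want to script this verification instead of eyeballing curl output, you can pull the numeric max-age out of the Cache-Control header. A small sketch; the header line is hard-coded here as a sample, but in practice you would pipe the curl -Is output in:

```shell
# Sample Cache-Control header as returned by curl -Is
header='Cache-Control: max-age=31536000'

# Extract just the numeric max-age value with sed
max_age=$(printf '%s\n' "$header" | sed -n 's/.*max-age=\([0-9]*\).*/\1/p')
echo "$max_age"
```

With the configuration above, a .png should report 31536000 and a .js file 604800.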

Jun 7, 2017, LinuxAdmin.io

