
Container Platform Comparison: Kubernetes, Docker, and Mesos

Containerization is a great way for businesses to deploy and manage their applications. It makes it easy to move applications between environments and keeps everything consistent across development, testing, and production. But with so many container platforms to choose from, it can be difficult to know which one is best for your business. In this blog post, we’ll take a closer look at three popular container platforms: Kubernetes, Docker, and Mesos. We’ll explore the key features of each platform, weigh their pros and cons, and offer guidance on when to use each one.

The platforms

Kubernetes

What is Kubernetes?

Kubernetes is an open-source container orchestration platform that was originally developed by Google. It allows businesses to deploy, scale, and manage containerized applications in a clustered environment. With Kubernetes, you can run multiple containers as a single unit, called a pod, and those pods can be deployed on a cluster of machines.
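As a minimal, hedged sketch (the names and images here are illustrative, not from any particular deployment), a two-container pod can be described in a single manifest:

```yaml
# Illustrative Pod manifest: two containers deployed and scheduled as one unit.
apiVersion: v1
kind: Pod
metadata:
  name: web-pod
spec:
  containers:
    - name: web
      image: nginx:1.25        # main application container
      ports:
        - containerPort: 80
    - name: log-sidecar        # helper container sharing the pod's network
      image: busybox:1.36
      command: ["sh", "-c", "tail -f /dev/null"]
```

Applying this with `kubectl apply -f pod.yaml` schedules both containers together on one machine in the cluster.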

Key Features of Kubernetes

  • Automatic scaling of pods
  • Self-healing
  • Rolling updates
  • Service discovery
  • Powerful and flexible API for automation and integration with other tools
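Several of these features surface directly in a Deployment manifest. The sketch below is illustrative (the names and image are assumptions): `replicas` drives scaling, the `strategy` block configures rolling updates, and the labels back service discovery:

```yaml
# Illustrative Deployment: replicated pods with a rolling-update strategy.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web
spec:
  replicas: 3                  # Kubernetes keeps three pods running
  strategy:
    type: RollingUpdate
    rollingUpdate:
      maxUnavailable: 1        # update pods one at a time
  selector:
    matchLabels:
      app: web
  template:
    metadata:
      labels:
        app: web               # services discover pods via this label
    spec:
      containers:
        - name: web
          image: nginx:1.25
```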

Pros and Cons of Kubernetes

Pros:
  • High level of scalability and availability
  • Easy to manage, update, and roll out new features and bug fixes
  • Widely used and supported by major cloud providers
  • Large and active community
Cons:
  • Steep learning curve
  • Complex to set up and manage

When to Use Kubernetes

  • When you have a large number of containers to manage
  • When scalability and high availability are important
  • When you are running your applications in the cloud
  • When you want to use a widely adopted and supported platform

Docker

What is Docker?

Docker is a platform for developing, shipping, and running distributed applications. It uses containerization technology to package an application and its dependencies together in a single container, which can be run on any machine that has Docker installed. This allows for consistency and reproducibility across different environments.
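As a hedged example, that packaging is usually expressed in a Dockerfile. The one below assumes a Python app with an `app.py` entry point and a `requirements.txt`, both hypothetical:

```dockerfile
# Illustrative Dockerfile: application plus dependencies in one image.
FROM python:3.12-slim          # base image with the runtime
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt   # bake in dependencies
COPY . .                       # add the application code
CMD ["python", "app.py"]       # default command when the container starts
```

`docker build -t myapp .` produces the image, and `docker run myapp` starts it on any machine with Docker installed.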

Key Features of Docker

  • Support for both Windows and Linux containers
  • A built-in container registry (Docker Hub)
  • An intuitive command line interface
  • Large community and ecosystem

Pros and Cons of Docker

Pros:
  • Simple to understand and use
  • Large community and ecosystem
  • Great for development and testing
  • Lightweight and portable
Cons:
  • Limited scalability and availability features
  • Limited support for complex multi-container applications

When to Use Docker

  • When you want something straightforward and user-friendly
  • When you are primarily focused on development and testing

Mesos

What is Mesos?

Mesos is an open-source cluster manager, maintained by the Apache Software Foundation, that lets you run multiple applications across a collection of machines. It makes it simple to share resources like CPU, memory, and storage between applications, and it can automatically reschedule them onto healthy machines if something goes wrong.

Key Features of Mesos

  • Dynamic resource sharing and isolation
  • High scalability and performance
  • Fault-tolerance and high availability

Pros and Cons of Mesos

Pros:
  • Dynamic resource sharing and isolation
  • High scalability and performance
  • Support for a wide range of frameworks and applications
Cons:
  • Complex to set up and manage
  • Limited community and ecosystem compared to Kubernetes and Docker

When to Use Mesos

  • When you need a highly scalable and fault-tolerant platform
  • When you need dynamic resource sharing and isolation
  • When you are already using or plan to use a framework that is supported by Mesos

Which container platform should you use?

When it comes to choosing a container platform, the decision ultimately comes down to the specific needs of your organization and application. Let’s look at the use cases once more, side-by-side:

Kubernetes:
  • Good for running and managing large-scale, complex workloads
  • Supports a wide range of frameworks and languages
  • A strong choice when you’re running applications in the cloud
Docker:
  • Good for running and managing small to medium-sized workloads
  • Good for development and testing environments
  • Simple and easy to use
Mesos:
  • Good for running and managing large-scale, fault-tolerant workloads
  • Offers dynamic resource sharing and isolation

Of course, the best fit for your company might vary. But, no matter which you choose, Electric Pipelines is here to help. Whether you decide to go with Kubernetes, Docker, or Mesos, we’ve got you covered. We can work with you to make sure it’s set up and running smoothly. Don’t hesitate to reach out to us so we can chat more about how we can help you with your container orchestration needs.

Conclusion

A container platform is a powerful tool that can help you manage your containerized applications. Kubernetes, Docker, and Mesos are all great options, each with its own pros, cons, and best use cases. Kubernetes is the most popular and widely used, Docker is great when simplicity and ease of use matter most, and Mesos shines when you need high scalability and fault tolerance. It’s important to weigh the options and choose the one that best fits your needs. And don’t hesitate to reach out to us at Electric Pipelines for help with containerization.


The Importance of Automated Testing in CI/CD

Automated testing is a crucial aspect of software development. It’s a vital part of the process that helps ensure that the code you’re putting out there is high quality, and without any hidden surprises. And what does automated testing have to do with CI/CD? In this post, we’ll answer that question, and a few more: what is CI/CD? What is automated testing? And, how can you utilize automated testing in CI/CD?

What is CI/CD?

Continuous Integration and Continuous Deployment, or CI/CD, is all about making the development process as efficient and streamlined as possible. It’s a process where you’re constantly integrating new code changes into your main codebase, and then deploying those changes to production. The typical CI/CD workflow includes:

  • Writing code
  • Committing it to a repository
  • Building the code
  • Running automated tests
  • Deploying the code to production

Automating this process is a priority to ensure it runs quickly and with minimal human involvement. This way, you’re able to catch any issues early on and get new features to customers faster.
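The workflow above maps almost one-to-one onto a pipeline definition. As a sketch only (this assumes GitHub Actions and a project with `make build` and `make test` targets, neither of which the post prescribes):

```yaml
# Illustrative CI pipeline: build and test on every push.
name: ci
on: push
jobs:
  build-and-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4   # fetch the committed code
      - name: Build
        run: make build
      - name: Run automated tests
        run: make test              # a failing test stops the pipeline here
```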

What is automated testing?

Automated testing is the process of using software to run tests automatically on an application. It involves writing test scripts that can execute automatically with minimal human intervention.

There are different types of automated tests you can use, like:

  • Unit tests: these tests focus on individual units of code, such as functions or methods, to ensure that they are working correctly.
  • Integration tests: these tests focus on the integration of different units of code, such as different modules or components, to ensure that they are working together correctly.
  • Acceptance tests: these tests focus on testing the overall functionality of the application to ensure that it meets the requirements and expectations of the customer.
  • Regression tests: these tests focus on ensuring that code changes do not break existing functionality.
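As a small, hypothetical sketch in Python, here is what a unit test and a regression test can look like; `slugify` is an invented helper, not from any real project:

```python
# A tiny "unit under test": a hypothetical slugify() helper.
def slugify(title: str) -> str:
    """Lowercase a title and join its words with hyphens."""
    return "-".join(title.lower().split())

# Unit test: checks one function in isolation.
def test_slugify_basic():
    assert slugify("Hello World") == "hello-world"

# Regression test: pins down a previously fixed bug
# (extra whitespace used to produce empty segments).
def test_slugify_collapses_whitespace():
    assert slugify("  Hello   World  ") == "hello-world"

if __name__ == "__main__":
    test_slugify_basic()
    test_slugify_collapses_whitespace()
    print("all tests passed")
```

A test runner such as pytest would discover and run the `test_` functions automatically; here they are called directly so the file is self-contained.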

Automated testing helps ensure that the code you’re putting out there is high-quality and bug-free. It’s efficient, it reduces human error and it improves the reliability of your software. With automated testing, you’re able to run tests quickly and without needing human intervention, which helps to catch any issues early on in the development process. This way you can be confident that your software is working as it should, and that your customers are getting the best experience.

How does automated testing fit into CI/CD?

You can seamlessly integrate automated testing into the CI/CD workflow. You can do this by running the tests automatically as part of the build process. This makes sure that any code changes that break existing functionality are detected early. It also helps to identify any bugs or issues before the code is deployed to production.

The process typically involves integrating automated testing tools into the build pipeline. This way, tests are run automatically each time new code is committed to the repository. This allows for fast feedback on code changes and ensures that the software is functioning as intended.

The importance of automated testing in CI/CD cannot be overstated. It helps ensure that the software being deployed is of high quality by:

  • Finding and fixing any issues early on
  • Catching bugs and problems before they become bigger headaches
  • Keeping the software reliable
  • Making sure the software meets the customer’s needs and expectations.

Overall, it’s a key step in making sure the software is the best it can be for the customer.

How can I utilize this?

Electric Pipelines

We understand that testing can be a tedious task, but we’re here to change that. Our team can help you set up automated testing in your CI/CD pipeline, so you can catch any issues early and deploy your software with confidence.

One of our key advantages is our use of AI technology, Psyborg, which helps us automate and improve our work processes for optimal results. Our experts will work with you to understand your specific testing needs and choose the best strategies for your project. We’ll then integrate these tests into your CI/CD pipeline, so you can rest easy knowing that any issues will be caught early on.

Conclusion

Automated testing is a vital component of any CI/CD pipeline. It ensures that:

  • Code is functioning as intended
  • Bugs and issues are identified and addressed early on
  • The software development process is streamlined

At Electric Pipelines, we specialize in setting up automated testing for your CI/CD pipeline. Our team of experts utilizes AI technology, Psyborg, to ensure the best possible results. If you’re looking to improve your testing process and gain more efficiency, we can help.

By choosing Electric Pipelines, you’ll be able to streamline your development process and focus on what you do best. So, don’t hesitate to reach out to us and let us help you take your software development to the next level.


How to Conduct a Cybersecurity Risk Assessment for Your Organization

Imagine this: You’re sitting at your desk on a Friday morning, finally able to relax after a long week of hard work. But it doesn’t last for long. Your phone starts ringing off the hook with panicked team members telling you that the company has suffered a security breach. Chaos ensues and you find yourself facing yet another long, grueling weekend at the office trying to fix everything. But it doesn’t have to be this way. By conducting a cybersecurity risk assessment and implementing a risk management plan, you can protect your company’s assets and minimize the risk of an attack. Don’t let a security breach ruin your weekend plans. Take action to keep your business safe and secure, so you can enjoy your weekends like the boss you are.

Conducting a risk assessment is a must for any organization. By prioritizing and addressing the most significant risks, you can effectively allocate resources and comply with legal requirements. Plus, it’s a key part of any risk management strategy.

This can seem like a daunting task, but it can be broken down into three main steps: 1. identifying your assets, 2. analyzing the threats and evaluating the vulnerabilities, and 3. creating a risk management plan. Let’s go through each of these steps in more detail.

Step 1: Identify Your Assets

What is an asset?

In this context, assets are the items that need to be protected. These could include data, systems, networks, and devices. Essentially, any piece of information or technology that is important to your organization’s operations is an asset.

Some common examples of assets that might need to be protected include:

  • Sensitive customer data
  • Financial records
  • Intellectual property
  • Critical business systems

This could include everything from your company’s financial records, to your employees’ login credentials, to your proprietary software.

A thorough identification of all assets is crucial for a successful cybersecurity risk assessment. Any asset you overlook is an asset you can’t protect, and that gap could open the door to a cyber attack. To avoid this, involve IT staff, business owners, and other relevant employees in the identification process. This will allow you to cover all your bases, from the most sensitive data to the less obvious assets. Don’t let cyber attackers slip through the cracks!

Step 2: Analyze the Threats and Evaluate the Vulnerabilities

What is a threat?

In the second step, you will need to consider the various types of threats that could potentially compromise your assets. These could include both external threats and internal threats like:

  • Hackers
  • Malware
  • Accidental data breaches caused by clumsy employees tripping over cords and knocking over servers
  • Rogue employees who are secretly working for the competition and trying to steal all of your company’s trade secrets

And those are just a few examples! The point is, you never know what kind of threats are out there waiting to attack your assets. Better to be prepared for everything than to be caught off guard by something sneaky and unexpected. It is important to consider the full range of potential threats, as this will help you to identify and address vulnerabilities that might be exploited by these threats.

What is a vulnerability?

A vulnerability is a weakness in a system or process that could be exploited by a threat. Some common examples of vulnerabilities that might be identified in a cybersecurity risk assessment include:

  • Unpatched software
  • Weak passwords
  • Lack of employee training or policies
  • Outdated software
  • Insecure configurations
  • Lack of network segmentation
  • Lack of access controls
  • Lack of encryption
  • Lack of multi-factor authentication
  • Lack of physical security measures
  • Lack of incident response plans
  • Lack of monitoring and detection systems

These are just a few examples, and the specific vulnerabilities that might be identified will depend on the specific systems and processes in place at an organization. These can leave your systems and data at risk of exploitation by cyber attackers.

It’s crucial to prioritize the threats that are most likely to exploit vulnerabilities and cause the most damage. At Electric Pipelines, we can help you identify vulnerabilities through security scans, and interpret the assessments to help you prioritize your efforts in addressing them. By prioritizing your efforts, you can ensure that you’re addressing the vulnerabilities that are most likely to be exploited by threats and that have the potential to cause the greatest impact. It’ll save you time, money, and maybe even a few gray hairs.
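One common way to prioritize is to score each asset/threat pair by likelihood times impact. The sketch below is purely illustrative; the assets, threats, and scores are invented:

```python
# Hypothetical risk register; names and scores are illustrative.
# Each risk is scored as likelihood x impact, both on a 1-5 scale.
risks = [
    {"asset": "customer database", "threat": "unpatched software", "likelihood": 4, "impact": 5},
    {"asset": "employee laptops",  "threat": "weak passwords",     "likelihood": 3, "impact": 3},
    {"asset": "public website",    "threat": "outdated software",  "likelihood": 2, "impact": 2},
]

def risk_score(entry: dict) -> int:
    """Simple risk score: likelihood multiplied by impact."""
    return entry["likelihood"] * entry["impact"]

# Address the highest-scoring risks first.
for entry in sorted(risks, key=risk_score, reverse=True):
    print(f'{risk_score(entry):>2}  {entry["asset"]}: {entry["threat"]}')
```

The exact scales and weights matter less than applying them consistently, so the riskiest items rise to the top of your to-do list.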

Step 3: Create a Risk Management Plan

In step 3, it’s time to put together a risk management plan. This plan should include all the steps you’ll take to protect your assets from the threats you identified in step 2. You’ll also want to prioritize and mitigate high-risk assets, because let’s face it, nobody wants to be the one who gets hacked and has to explain to the boss why their sensitive data is all over the dark web.

When it comes to prioritizing and mitigating high-risk assets, we’ve got you covered. Along with security scans and interpretation of security assessments, we can also help to mitigate security flaws. We can identify vulnerabilities and take steps to fix them, reducing the likelihood of a successful attack. And hey, who doesn’t love feeling secure and confident in their company? It’s like a warm hug, but for your data. Our team of experts is here to help you protect your assets and minimize the risk of a security breach. Contact us today to learn more about how we can help.

By working with us, you can be sure your organization’s assets are protected and your risk of a cyber attack is minimized – all without hiring additional staff. Protecting your business is an important investment, and we’re here to help you make the most of it. Plus, it’s always good to have a team of experts on your side – especially when it comes to keeping your business safe and secure. Imagine all the high fives and fist bumps you’ll be giving out once your risk management plan is in place!

Conclusion

Before we finish up, let’s do a quick recap of the steps you can take:

  • Identify your assets and get a good sense of everything that needs protecting, from your most sensitive data to your low-key servers.
  • Analyze threats and evaluate vulnerabilities to understand the potential risks to your assets.
  • Determine the risks and create a risk management plan to prioritize your efforts and allocate resources effectively, ensuring that you’re addressing the most significant risks first.

If you’re ready to take your security to the next level, don’t be shy! Give us a shout at Electric Pipelines. Our team is ready and willing to help you protect your assets and fend off those pesky cyber attacks. Whether you need security scans, interpretation of security assessments, or help mitigating flaws, we’ve got your back. Don’t wait until it’s too late – contact us today to get started.


What is Containerization? Exploring the Basics

Containerization is a powerful tool for software development that has gained widespread popularity in recent years. Essentially, containerization is a way to package an application, along with its dependencies, into a single container that can run consistently across different environments. By using containers, developers can create portable, scalable, and isolated applications that can be easily deployed and managed. It’s a game changer for software development as it makes it more efficient, reliable and cost-effective. In this post, we will explore the basics of containerization, its key concepts and benefits, and how it can revolutionize software development.

Now that we have a better understanding of what containerization is, let’s delve deeper into how it works in the next section.

What is a container?

When it comes to running an application, there are a couple of ways to do it. You could use a virtual machine, which is like creating a separate computer within your computer. This approach allows you to run multiple operating systems on the same computer, but it can be quite resource-intensive.

Another way is to use containers, which allow you to package an application and its dependencies together, so it can run consistently on any environment.

What’s the difference?

While both technologies let you run software in an isolated environment, containers are the lighter and more efficient option. They allow you to run multiple applications in the same environment without creating a virtual machine for each one, making them a perfect fit for a streamlined software development process. And with the help of container orchestration tools, you can easily spin up, scale, and manage applications in response to changing user demand, ensuring they keep running smoothly no matter how many users you have.

Another way containers differ from virtual machines is in how they use resources. Each virtual machine needs its own full operating system, which consumes a lot of memory and disk space. Containers share the host’s operating system kernel instead, so they start up faster and use far fewer resources. They also isolate the different parts of an application at a finer-grained level than virtual machines do.

Using containerization makes it easier to package and run your applications. It’s far more efficient than using virtual machines, and it brings extra benefits like improved portability, scalability, and isolation.

Popular Container Apps

There are a few different container technologies out there, but some of the most commonly used are Docker, Kubernetes, and Apache Mesos. Let’s take a look at each one in more detail.

Docker

Docker is a popular containerization platform that makes it easy to create, deploy, and run applications in containers. It provides a command-line interface that simplifies the process of working with containers, making it easy for developers to create and manage containers for their applications. Docker also provides an ecosystem of tools, such as Docker Hub, a centralized repository for storing and sharing container images.

Kubernetes

Kubernetes is an open-source container orchestration system that automates the deployment, scaling, and management of containerized applications. It was originally developed by Google, and has since become one of the most popular container orchestration platforms. Kubernetes provides a powerful set of features for scaling and managing containerized applications, making it a popular choice for running large-scale, production-grade applications. Our team has a deep understanding of this technology and can assist in effectively utilizing it to meet the needs of your organization. We can containerize your applications, and even migrate them to the cloud.

Apache Mesos

Apache Mesos is another open-source container orchestration system, like Kubernetes. It aims to simplify the scaling and management of applications. Mesos provides a simple and powerful API for scheduling and managing containers. It’s often used in combination with other container orchestration and scheduling frameworks, such as Marathon, to manage large-scale container deployments.
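As a hedged illustration of that combination, a Marathon application definition is a small JSON document; the values below are invented:

```json
{
  "id": "/web",
  "cmd": "python3 -m http.server 8080",
  "cpus": 0.5,
  "mem": 128,
  "instances": 2
}
```

Marathon asks Mesos for the CPU and memory listed here, keeps two instances of the command running, and restarts them elsewhere if a machine fails.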

Benefits of Containerization

Containerization offers numerous advantages for software development and deployment, including increased portability, isolation, scalability and efficiency. These benefits enable organizations to package and distribute applications more easily, run them in different environments and handle changing user demands. Let’s look at each one of these benefits more closely and see how they can be applied in practice.

Portability

Containers provide a consistent way to package and distribute applications, making it easy to move them between different environments, such as from development to production. This helps organizations to more easily test and deploy their applications, reducing the time and effort required to make them production-ready. Containers also make it easier to run applications in different environments, whether it’s on-premise, in a public or private cloud.

Isolation

Containers provide a level of isolation between different parts of an application, which helps to prevent conflicts and makes it easier to manage dependencies. This makes it more convenient for developers to create and test applications, and for operations teams to deploy and manage them. Containers also allow multiple applications to run on the same host without interfering with one another, which can help you make the most of your resources and cut down on expenses.

Scalability

Container orchestration tools like Kubernetes make it easy to scale applications up or down depending on user demand. This helps ensure that applications are always running smoothly, lets you respond quickly to changing business needs, and handles high loads during peak usage times. Because orchestration automates deployment, scaling, and management, it also minimizes the effort and cost of scaling containerized applications.
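In Kubernetes, for example, this kind of demand-driven scaling can be declared with a HorizontalPodAutoscaler. The sketch below is illustrative (the `web` Deployment and the thresholds are assumptions):

```yaml
# Illustrative autoscaler: scales the "web" Deployment between
# 2 and 10 replicas to hold average CPU utilization near 70%.
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: web
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: web
  minReplicas: 2
  maxReplicas: 10
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70
```

Kubernetes then adds or removes replicas on its own to keep utilization near the target, with no manual intervention.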

Efficiency

Containerization can increase efficiency in the software development process in a few ways. For example, by packaging an application and its dependencies together in a container, developers can ensure that the application will run consistently across different environments. This eliminates the need for developers to spend time troubleshooting environmental issues and allows them to focus on writing code. Additionally, using tools like Kubernetes can streamline the development and deployment process, reducing the time and effort required to get applications to a production-ready state.

Conclusion

As we’ve seen above, containerization is a powerful technology. To reiterate, there are many reasons to use containers. These include increased:

  • portability
  • isolation
  • scalability
  • efficiency

These benefits make it easier to test, deploy, and scale applications, all while reducing costs and increasing resource utilization.

With all these benefits in mind, it’s clear that containerization can bring significant advantages to any organization. And that’s where Electric Pipelines comes in. Our team has a deep understanding of containerization and can containerize your applications for you. Don’t miss out on the benefits of containerization. Contact us today to get started.
