
Adium Tech Owner's Blog

An occasionally updated tech blog covering all things tech in our ever-evolving digital world.

Autonomous Vehicles: The Safer Option of the Future

11/22/2022

Follow Daniel Burns on Twitter, @DBurnsOfficial.

Risk Assessment: IoT Security

5/10/2022

A risk assessment on the current state of IoT (Internet of Things) security and what needs to be done moving forward.

2021 Cyber Policy Pitch Proposal

12/15/2021

Written by Daniel Burns
The following article is a final project completed for a Cyber Warfare class at San Diego State University: a video pitch paired with a written primer on the need for updated cyber defense policy within the United States government.
Video:
PDF Proposal:

Information Systems Solution Plan: Tesla Inc. - PR and Customer Service

4/27/2021

Written by Daniel Burns
Contents:
  • Company Overview
  • Strategic Organizational Goals
  • Technology Assessment
  • Technology Goals
  • Information Systems Guidelines
  • SWOT Analysis
  • IS Strategic Initiative
  • References
Company Overview
        Tesla is first and foremost a technology company that also develops electric cars and solar energy products. They produce the Tesla Model S, 3, X, and Y as well as solar panels and the Tesla Powerwall, a home battery backup system. All of their products work together in the Tesla ecosystem, paired with the Tesla mobile app, to deliver a seamless, all-in-one experience. Their structure and approach to products is very similar to Apple's in that they attempt to build and produce every single component to their own spec, from software to hardware. In a way, Tesla is a bigger software company than many realize, with their Autopilot technology, industry-leading car user interface, and energy production monitoring in the Tesla app. Their upcoming FSD (Full Self-Driving) Beta program is also showing very promising results and will be the future of transportation.
Strategic Organizational Goals
         Tesla's mission is to accelerate the world's transition to sustainable energy. To do this, they launched a master plan. They started with the Roadster in 2008, a high-priced, low-volume, all-electric car that was fun to drive and proved their initial concept: electric cars don't have to be boring. It had to be expensive to fund the research and production of their next car, the Model S, a premium, four-door, hatchback-style sedan with a lower price point and a large electric range. Part two of the plan introduced a larger electric SUV called the Model X, which addressed the other half of the premium car market. Both cars were still expensive, once again to cover new production technique costs and the research and development for their next, affordable electric car. The Model 3 was introduced in 2017 as a mass-market, high-volume, affordable car. Starting at $35,000, it opened the floodgates and set a new standard for the electric car market, just as the S and X had before it. It has over 300 miles of range and is built on an almost fully automated, streamlined production line.
        Now, all their cars are produced using a casting process with the world's largest casting machine, saving both time and production costs. Not only do their electric vehicles help achieve the company's mission statement, their solar energy products do as well. A green grid is needed to power our homes and charge our cars. So, Tesla builds their own solar panels at their Buffalo, New York Gigafactory and produces batteries for their Powerwall home battery packs at their Sparks, Nevada Gigafactory. This allows homes to be totally independent from the grid if needed (and during blackouts), putting free energy from the sun right into the battery pack to charge the car and power the house. It also relieves some stress on the grid, especially during peak times. Tesla also offers large, scalable battery packs that replace substations, called the Megapack. This system provides mass energy storage that relieves grid stress for local utilities. Paired with a solar farm, a whole city can run off solar energy 24/7.
        Megapacks also scale down to smaller sizes to cover large buildings, instead of needing many Powerwalls in series. Part two of the Master Plan is expanding into all forms of ground transportation, starting with the Tesla Semi truck and the Cybertruck pickup, bringing over 500 miles of electric range to areas of the market that drastically need a shift to green energy. Tesla is also going back to their roots and reviving the Roadster as the supercar that beats all supercars, with a range of 620 miles and a 0-60 time of just 1.9 seconds, all for $200,000: an affordable tag in a high-priced supercar market. Their goal is to disrupt every aspect of the transportation market and create a fun, compelling alternative that shifts people's opinions on electric vehicles, and thus to accelerate the transition to sustainable energy.

Technology Assessment
        As a tech company, Tesla is nothing but technology focused, from advanced cutting-edge tech found in their cars to leading production processes in the factory. Tesla's focus on technology falls into four categories: battery and powertrain; vehicle control and infotainment software; self-driving development; and energy generation and storage. For battery and powertrain, Tesla has designed their own proprietary powertrain system to be adaptable, efficient, reliable, and cost-effective while withstanding the rigors of an automotive environment. Tesla offers dual-motor powertrain vehicles, which use two electric motors to maximize traction and performance in an all-wheel-drive configuration, and is introducing powertrain technology featuring three electric motors for further increased performance.
        As far as battery development, they maintain extensive testing and R&D capabilities for battery cells, packs, and systems, and have built an expansive body of knowledge on lithium-ion cell chemistry types and performance characteristics. To enable a greater supply of cells for their products with higher energy densities at lower costs, Tesla is currently using that expertise to develop a new proprietary lithium-ion battery cell and improved manufacturing processes. Vehicle control and infotainment software is a major part of the customer experience, as it is the main interaction between the user and the vehicle. The performance and safety systems of their vehicles and battery packs require sophisticated control software.
      Control systems in their vehicles optimize performance, customize vehicle behavior, manage charging, and control all infotainment functions. Tesla develops almost all of their software internally, including most of the user interfaces, and updates their vehicles' software regularly through over-the-air updates. Self-driving development is coming along fairly quickly, and this category will become one of the defining factors of Tesla's long-term success, if it hasn't already. Tesla has expertise in developing technologies, systems, and software to enable self-driving vehicles using primarily vision- and radar-based sensors.
      The FSD Computer runs Tesla's neural networks in their vehicles, and they are also developing additional computer hardware to better leverage the massive amounts of field data captured by those vehicles, continually training and improving the neural networks for real-world performance. Currently, Tesla offers users certain advanced driver-assist systems under the Autopilot and FSD options. It is extremely important to note that the driver is ultimately responsible for controlling the vehicle; Tesla's systems provide safety and convenience functionality that relieves drivers of the most tedious and potentially dangerous aspects of road travel, much like the systems airplane pilots use when conditions permit. As with other vehicle systems, Tesla improves these functions over time through free over-the-air updates.
          Finally, energy generation and storage offer customers an extension of the Tesla ecosystem to fully integrate a sustainable energy lifestyle. They leverage many of the component-level technologies from their vehicles in their energy storage products. By taking a modular approach to the design of battery systems, Tesla can optimize manufacturing capacity among energy storage products. With the expertise in power electronics, it enables them to interconnect battery systems seamlessly with electrical grids while providing fast-acting systems for power injection and absorption.
          Tesla has also developed software to remotely control and dispatch energy storage systems using their real-time energy trading platform for larger scale productions such as Megapack. They have also engineered Solar Roof over numerous iterations to combine aesthetic appeal and durability with power generation. The efficiency of Tesla’s own solar energy products is aided by their own solar inverter, which also incorporates their power electronics technologies. Both products have been designed to integrate directly with Powerwall.

Technology Goals
          All of Tesla's goals relate directly to their mission of transitioning the world to sustainable energy. In battery development, Tesla is creating their own battery and a unique dry-cell manufacturing process to reduce production time and waste materials, and to produce a highly efficient battery that transitions all their cars to 400-to-500-mile-plus range. This new battery process will also ultimately allow vehicles to cost less for the consumer. Their goal of achieving full autonomy in all vehicles that already have a Full Self-Driving computer installed is very close to being ready. Tesla intends to establish a future autonomous ride-hailing network, which would also give Tesla access to a whole new customer base even as modes of transportation evolve.
          Moving forward, all new orders for Tesla solar or Tesla Powerwall will require both to function: if you purchase solar you must purchase a Powerwall, and vice versa. This streamlines the installation process and requires less involvement with permitting and PTO (permission to operate) from the city and local energy utility. It also allows solar panels to connect directly to the battery, eliminating an inverter and additional installation costs. Public charging infrastructure continues to grow as the Supercharger network adds many large- and medium-scale charging stations worldwide every year. Supercharger stations are typically placed along well-traveled routes and in and around dense city centers, giving Tesla owners quick, reliable, and ubiquitous charging with convenient, minimal stops. Stations are added where they are needed most, guided by customer requests for specific areas.

Information Systems Guidelines
       Tesla doesn't disclose much about its information systems guidelines, treating them as a strategic advantage in a competitive market, but we can infer some things from the outside. Tesla adheres to a hectic schedule of streamlined goals, whether that means building a semi-temporary production line during "production hell" to increase capacity or raising and lowering prices without notice. Tesla does this to survive, to maintain focus, and to achieve long-term growth. Even if some of their tactics seem odd, there is always a driving goal in mind. Tesla is also a data-driven company: huge amounts of data come in from their vehicles every day, improving their self-driving models.
          Tesla, like Apple, takes privacy very seriously, and Elon Musk has affirmed his stance on user privacy by anonymizing incoming data. He is also very adamant about his stance on AI (Artificial Intelligence) and machine learning: the need to control who holds the most power, and to contain AI at a healthy level for humans to survive. This stance can be assumed to extend across all of Tesla's information systems. We know that Tesla uses AGL (Automotive Grade Linux) for most of its in-car computers and loves to customize its software widely. No secrets are to get out, and we can infer everything is very secure.

SWOT Analysis of Technology Goals

Strengths
1. The Power of the Tesla brand name.
The Tesla logo and brand name are highly recognizable around the world. Tesla arrived at the perfect time, when adoption, innovation, demand, and a market shift all aligned in their favor. They have the premium electric vehicle market to themselves and are dominating in new vehicle sales. Tesla has become a household name for electric vehicles, just as Toyota and Honda did for gasoline cars, leaving legacy auto to struggle with miss after miss as Tesla pushes farther and farther ahead of the competition.

2. Tesla has the best range of any electric car.
Tesla's focus on consistently achieving the highest range in every category is paramount to its success. In combination with their future battery manufacturing process and efficient powertrains, Tesla will continue to out-range the competition. The Model S Plaid+ currently claims a whopping 520 miles of range, and the Model 3 and Y are approaching 400 miles. This kills the age-old myth that gasoline cars are more reliable because they have a longer range.

Weaknesses
1. Customer service needs help.
While Tesla's customer satisfaction is among the highest of any car on the road today, Tesla customers have some of the worst interactions with customer service. Calling in for help with an issue takes forever, getting help during the buying process isn't as reliable as it once was, and instantly reaching an actual person is rare, especially for a custom request or a solar project.

2. Tesla = Musk
When you think of Tesla you instantly think of Elon Musk, and whatever he does in his life can affect Tesla, whether or not those decisions involve Tesla at all. There's also the fact that he runs three other successful companies at the same time: SpaceX, The Boring Company, and Neuralink. If he retires or simply vanishes, will Tesla survive on its own? I'd like to think so, just as Apple did after Steve Jobs. But that remains to be seen, and one person can only do so much.

Opportunities
1. Gigafactories are growing left and right.
Tesla has the opportunity to be in every corner of the world, distributing its vehicles anywhere. Currently there are Gigafactories in California, Nevada, New York, Texas, China, and Germany, with more possibly coming in Japan and elsewhere in Asia. This factory blueprint can scale massively and out-produce every other car brand in both quality and volume.

2. Tesla could sell fleets to corporations so they can meet their environmental goals.
As electric vehicle adoption becomes widespread, bigger companies will need to replace their gas-guzzling vehicle fleets. Tesla has the opportunity to sell the Model Y, Model 3, Cybertruck, Semi, and a possible van to hundreds of companies around the world. They just need to meet current production demand, helped by the growing number of Gigafactories and a continually refined production process.

Threats
1. Competition
Competition is heating up from companies such as Ford with its Mach-E and GM with its Bolt and Hummer EV. Legacy auto still has a lot to learn, but they also have deep pockets, so it is only a matter of time until they catch up. Catching up on charging infrastructure and self-driving technology may take even longer, as Tesla has an even bigger lead there. Almost every major car manufacturer has recently announced plans to go all-electric, so Tesla has a limited window to uniquely position itself as the better brand.

2. Self-driving car acceptance.
For their self-driving division to succeed, Tesla needs mass-market acceptance of the technology, and as usual with new technologies, people are apprehensive. Other car manufacturers may eventually make it feel normal, but that will take time. Tesla, being the first to do it right, will always come under scrutiny, even if it's much safer than driving the old-fashioned way. People don't like change; it's human nature, and it will take time for people to get used to it.

Information System Strategic Initiative

Customer Service and PR Overhaul
          Tesla's current customer service is abysmal, and their response to major issues and false stories has been almost nonexistent since they dissolved their PR team. A plan is needed to fix those faults and repair their public image on the customer relations front.

        The cost of expanding customer service alone will be an extra $2 million a year, plus an estimated $2 million to revive the PR department. Their current services revenue of $19 million can easily cover the $4 million needed to fix this issue. This estimate is on the high end, covering both the customer service overhaul and department expansion. It also funds a whole new PR department, which was disbanded two years ago, so some infrastructure may still be present; technology moves fast, however, so it will have to be checked and upgraded.
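The arithmetic behind these estimates is simple to sanity-check. A minimal sketch, using only the dollar figures stated above (all of which are this plan's rough estimates, not Tesla's reported financials):

```python
# Back-of-the-envelope check of the plan's budget estimates.
# All figures are in millions of USD and come from the text above;
# they are rough planning estimates, not Tesla's actual financials.
customer_service_expansion = 2.0   # added annual customer service cost
pr_department_revival = 2.0        # estimated annual cost of a new PR team
services_revenue = 19.0            # stated current services revenue

total_initiative_cost = customer_service_expansion + pr_department_revival
share_of_revenue = total_initiative_cost / services_revenue

print(f"Total annual cost: ${total_initiative_cost:.1f}M")
print(f"Share of services revenue: {share_of_revenue:.0%}")
```

At roughly a fifth of current services revenue, the initiative is affordable but not trivial, which is why the estimate is deliberately padded toward the high end.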
          One risk is that increased customer service availability may not fix the reliability of customer service. That would need to be addressed with a management change and mass department training, making sure the right policy is written to respond to any case necessary. As a high-profile company, any major issue that arises in customer service should be reviewed by PR in anticipation of false information in the media or customer backlash on social media. Elon Musk also does not believe in the value of PR and would first have to allow the department to be reopened, then come to see its value for the company's public image.

            Looking at job listings at Tesla, we can infer that their network infrastructure is built very similarly to other major players'. Listings seek people experienced with Juniper and Palo Alto Networks firewalls, configuration management with Ansible, Puppet, or Terraform, and Linux operating system internals, along with Docker, GCP, Kubernetes, and AWS experience. So, it would be a secure, Linux-based environment, at least on the network side. On the customer service employee side, Windows is the standard choice, running on small-form-factor computers at employees' desks paired with modern VoIP phone systems. Standard practice seems to be in play here, even without exact knowledge of the programs they use to respond to and look up customers.

            Current network engineers and help desk employees will suffice for the initial expansion. Adding one or two more help desk employees will help with ongoing issues in the expanded user base in the customer service department. There may be a need to add an additional network engineer who already has specific knowledge on customer service systems, especially if a new system is needed. A new system may only be needed once an internal audit is taken on the current feedback from employees.

            Interestingly, their current customer support is a division of the sales team. Separating the teams may help distinguish operations and procedures to better suit customers' needs, making things quicker both for a customer calling in for sales help and for another calling in for support on their car or solar system. This also avoids any conflict of interest in selling more products to existing customers, increasing customer satisfaction and simplifying department processes. It also lets the infrastructure team create two separate software programs to aid employees in assisting customers.

          It is not specifically known whether a third-party or internally developed program is used for support, but we can infer from job listings that it may be internally developed, since they list no requirements for industry-known systems, only customer service environment experience. Because of these unknown, possibly internal systems, in addition to Tesla's position on secrecy, outside contractors and business partners are likely not used in the PR and customer service divisions. Outside partners are instead far more visible on their production lines, with building contractors, machine companies, and new equipment processes.

            New training initiatives and programs will be created after receiving internal feedback from customer service employees, in addition to surveys of Tesla owners who call in. On the PR side, a new take on public relations and press-release tactics using social media will garner modern support while connecting directly with the ever-strong Tesla community on Twitter and Facebook. Since the dissolution of the previous PR team, Twitter has been the front line where Tesla owners and enthusiasts defend the brand whenever incidents occur. By leveraging social media and precise responses, Tesla will have better control of the narrative while needing only a smaller PR team compared to other major tech companies. This approach saves Tesla money and is a modern alternative to mass distributed teams.

            The timeframe for expanding customer service, retraining employees, and recreating the PR division will be roughly one year. This includes searching for new hires, expanding network capacity, and, with the help of a work-from-home option, an easy expansion without needing much on-site workspace. The plan will cost an extra $4 million a year, with the possibility of expanding the customer service budget later if needed, and will add 25 new employees to customer service and 15 new employees to the newly created PR department.
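The staffing math above can be checked the same way. Again, these are only this plan's stated numbers, and the budget also covers training and infrastructure, so the per-hire figure is an upper bound on compensation, not a salary estimate:

```python
# Headcount implied by the plan: 25 customer service + 15 PR hires,
# measured against the $4M/year budget. Figures are this plan's
# estimates only, not Tesla's actual staffing or payroll data.
new_customer_service_hires = 25
new_pr_hires = 15
annual_budget_usd = 4_000_000

total_new_hires = new_customer_service_hires + new_pr_hires
budget_per_hire = annual_budget_usd / total_new_hires

print(f"Total new hires: {total_new_hires}")
print(f"Budget per hire: ${budget_per_hire:,.0f}/year")
```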

To succeed with this initiative Tesla must:
  • Identify their current customer service issues.
  • Accept that this needs to be addressed.
  • Get Elon Musk to reassess his stance on good PR.
  • Listen to the Tesla community.
  • Identify weaknesses in the sales and support programs.
  • Look at third party software solutions in addition to weighing the costs of bolstering their own.
  • Hire 40 new employees to staff these changes.
  • Train new and existing employees with new training program.
  • Expand network infrastructure into workspaces.
  • Distribute new computers via the help desk, staffed with new help desk employees.
  • Encourage a work from home environment for customer support.

References
TSLA 10K EDGAR Filing for 2020. (2020, December 31). Retrieved April 27, 2021, from https://www.sec.gov/Archives/edgar/data/1318605/000156459021004599/tsla-10k_20201231.htm
Dougherty, K. (2021, April 14). Elon Musk's complete master plan. Retrieved April 27, 2021, from https://solartribune.com/master-plan/
Running head strategic information system tesla strategic. (n.d.). Retrieved April 27, 2021, from https://www.ozassignments.com/solution/running-head-strategic-information-system-tesla-strategic
Li, Z. (2018, April 16). Strategic audit on Tesla. Retrieved April 27, 2021, from https://digitalcommons.unl.edu/cgi/viewcontent.cgi?article=1055&context=honorstheses
Privacy & Legal: Tesla. (n.d.). Retrieved April 27, 2021, from https://www.tesla.com/about/legal
About Tesla: Tesla. (n.d.). Retrieved April 27, 2021, from https://www.tesla.com/about

The future of the personal computer is within ARM's reach: the transition from x86 to ARM silicon.

4/25/2021

Written by Daniel Burns
Contents:
  • Abstract
  • Introduction
  • What is ARM and why is it used in mobile?
  • X86’s Background
  • Intel's Stagnation
  • Efficiency and Power - Apple’s Transition
  • Apple’s Previous Experience
  • Apple’s Advantage
  • How Does the M1 Compare?
  • What has Microsoft done with ARM in the past?
  • Learning from Apple’s Success
  • Intel’s Strategy and Moving Forward
  • Conclusion
  • References
Abstract
          This article covers the transition of personal computers to ARM (Advanced RISC Machine) processors. It details Apple's current transition from Intel to their own ARM-based Apple Silicon processors, and how that transition will affect the rest of the PC industry. The paper discusses the background of ARM CPUs, Apple's past transition from IBM PowerPC to Intel, and the major advantages of switching to an all-new architecture. I explain how Intel's innovation stagnation has caused the PC industry to seek alternatives while Intel changes direction, and I try to understand why the industry has only recently realized ARM's full potential even though it has been around for the past 36 years. Other topics include ARM's success in mobile devices, Apple's A-series mobile chips staying consistently ahead of the game, and the breakthroughs ARM keeps achieving over the traditional x86 architecture in both performance and efficiency. I attempt to answer the question of whether it is time to fully transition the rest of the PC industry to ARM-based processors.
Introduction
          With the introduction of Apple Silicon M1-based Macs, Apple has awoken the personal computer industry once again. Apple has done this by bringing the same CPU architecture they've been using for the past 10 years in their mobile devices over to the Mac desktop and laptop platforms. Their new ARM-based chip, the M1, easily outperforms comparable legacy x86 Intel chips in three major categories: value, performance, and efficiency. Both smartphones and tablets have used ARM-based processors for the past 15 years and have continually shown major leaps in performance and efficiency with each new generation. Specifically, since the iPhone 5S in 2013, Apple's A-series mobile chips in iPhones and iPads have consistently outperformed stock PC chips and even Qualcomm's own Snapdragon ARM SoCs (Sims 2017), an impressive feat for Apple's silicon team, staying years ahead of the competition and proving ARM's worthiness in the modern market. Apple's influence has caused Windows PC manufacturers to look at alternatives to x86 Intel and AMD CPUs as those have become stagnant in year-over-year performance increases. The entire PC industry should learn from Apple's success with ARM and apply it to their own roadmaps. There has never been a better time to transition the entire PC industry to ARM processors, just as the industry has successfully done with mobile devices.

What is ARM and why is it used in mobile?
          ARM stands for Advanced RISC Machine and belongs to the family of Reduced Instruction Set Computing, or RISC for short. RISC describes a processor design that is small, highly specialized, and highly optimized for a specific set of instructions. ARM is more specialized than architectures such as Intel's x86 or other complex instruction set computers, CISC for short (Fulton 2020). ARM is leading the way as the processor of choice in compact devices such as smartphones, tablets, TVs, fridges, speakers, and cars: devices that need to be small and power-efficient and that need compact, all-in-one SoCs (Systems on a Chip). Past processors consisted of a single processing unit on one chip. In contrast, modern ARM processors are SoCs, integrating memory, interfaces, radios, and extra specialized cores, such as the dedicated machine learning cores found in Apple's A-series mobile device chips. Mobile devices have adopted ARM specifically for its performance per watt, a benefit for laptop users because lower wattage enables better battery life. Given its massive success and efficiency in mobile devices, it is probable that the PC industry will look to ARM as the next avenue in processor technology (Regidi 2020).
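As a practical aside, the architecture split described above is visible from software: Python's standard library reports which CPU family the running machine uses. A minimal sketch (the exact strings returned by `platform.machine()` vary by operating system and hardware, so the mapping below covers only the common cases):

```python
import platform

# platform.machine() reports the CPU architecture of the running system.
# Typical values: "x86_64"/"AMD64" on Intel and AMD machines,
# "arm64"/"aarch64" on ARM machines such as Apple Silicon Macs.
arch = platform.machine().lower()

if arch in ("arm64", "aarch64"):
    family = "ARM (RISC)"
elif arch in ("x86_64", "amd64", "i386", "i686"):
    family = "x86 (CISC)"
else:
    family = f"other ({arch})"

print(f"This machine reports: {platform.machine()} -> {family}")
```

Running this on an M1 Mac reports the ARM family, while an Intel or AMD PC reports x86, mirroring the RISC/CISC divide discussed here.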

x86's Background
          ARM processors are becoming the more logical choice for laptops and desktops, which are thinner and smaller than ever while end users become more mobile. Blem, a researcher at the University of Wisconsin, sums up why ARM is the next best option: "Because of the vast difference in how ARM handles instructions compared to x86, the efficiency, power draw, and speed gained from such a small package, performance per watt, in essence, ARM is the ideal choice for mobile devices" (Blem 2013). The x86 architecture has been around since 1978, when Intel developed it for a 16-bit microprocessor called the 8086. It was revolutionary for its time because memory segmentation allowed the use of higher-capacity memory than Intel's previous 8-bit architecture. The name x86 comes from the 86 in 8086, a naming convention that has continued even as the industry transitioned from 32-bit to 64-bit computing over the last 10 years. Today, x86 refers to the instruction set both Intel and AMD still use. For the past four years, both Intel and AMD, Intel more so, have been stuck in a stagnant holding pattern with their current generations of processors, which run hot and draw too much power while not improving drastically generation over generation as past iterations did. Consider it a kind of Moore's Law roadblock: an aging architecture running head-on into heat constraints.
Intel's Stagnation
          Specifically, Intel has been stuck on the same 14nm (nanometer) process node for years. Aside from pushing in more cores and increasing base clocks and wattage, the performance margins on the aging fabrication are getting slimmer and slimmer. Management at Intel is also to blame: former CEO Bob Swan was recently replaced by former VMware CEO Pat Gelsinger. Swan was more focused on management and on emerging buzz categories such as AI (Artificial Intelligence) and 5G than on what Intel does best, chip development. With Gelsinger's engineering experience, Intel can hopefully get back to their roots. Meanwhile, Apple and Qualcomm are on a 7nm process, beating most of Intel's and AMD's fastest CPUs. Apple is moving to 5nm as we speak, and it has most likely been on their well-thought-out roadmap for years. Apple's and Qualcomm's smaller, more power-efficient ARM chips have outperformed the older x86 architecture in mobile performance. Lei points out that "ARM-based mobile smart devices are becoming more and more ubiquitous and the preferred platform for users' daily computing needs is shifting from traditional desktop to mobile smart devices" (Lei Xu 2012). The smartphone industry has proven ARM's worthiness in our mobile devices, making it highly likely that consumer desktops and laptops will adopt ARM as well.
Apple’s Transition to the M1
            Apple’s transition to the M1 will not be easy; it requires at least a full two-year transition period. A ramp in production is needed to fully achieve independence from a tech giant such as Intel or AMD, and Apple needs the full support of developers worldwide to ensure that end users’ favorite apps work as well as, or better than, they did on x86 Macs. Apple’s biggest advantage in using its own ARM-based processor is control of the full stack: Apple alone designs, produces, and ships every major component in a Mac, giving it complete and tight control over how its devices perform and how long they are supported. Apple also has a very strong connection with the developer community, more so than Google and Microsoft have with their respective development platforms. On the iOS and iPadOS side, app developers have Xcode and a massive ensemble of tools and kits that make the app-building experience as seamless as possible. This tight-knit process allows a level of integration between software and hardware so good that even Google and Microsoft still haven’t been able to fully replicate it.
​Apple’s Previous Experience
          Xcode is Apple’s all-in-one IDE (integrated development environment) used to develop software for macOS, iOS, iPadOS, watchOS, and tvOS. This isn’t the first time Apple has moved from one CPU architecture to another. From 1992 to 2006, Apple used its own line of processors, the PowerPC, born of an alliance between Apple, IBM, and Motorola (Dignan 2020). The PowerPC was a RISC-based ISA (instruction set architecture) and, at the time of its creation, a powerful chip. Then, in 2006, Apple began its transition to Intel for one big reason: efficiency, specifically in performance and heat output. Intel’s Core architecture showed a huge improvement in heat output compared to the PowerPC G5 of the time.
          The PowerPC G5 required a massive cooling system to deal with the heat it dissipated, which is one reason a G5-powered Apple laptop never came to fruition. In fact, shortly before the switch to Intel, Apple shipped a liquid-cooled Power Mac G5 desktop because of the chip’s tendency to overheat. To ease the transition for developers and users, Apple created Rosetta, a dynamic binary translation program that allowed PowerPC-only apps to run on Intel machines. This bridged the gap while developers updated their apps for Intel Macs.
​Apple’s Advantage
          Apple is now once again asking developers to transition their apps, this time from Intel to Apple Silicon. It has made the move easier in 2020 with Rosetta 2, a real-time instruction translator that allows x86 Intel apps to run on Apple Silicon Macs. Fulton explains how Rosetta 2 works in macOS:
          “For MacOS 11 to continue to run software compiled for Intel processors, the new Apple system will run a kind of "just-in-time" instruction translator called Rosetta 2. Rather than run an old MacOS image in a virtual machine, the new OS will run a live x86 machine code translator that re-fashions x86 code into what Apple now calls Universal 2 binary code -- an intermediate-level code that can still be made to run on older Intel-based Macs -- in real-time. That code will run in what sources outside of Apple call an "emulator," but which isn't really an emulator in that it doesn't simulate the execution of code in an actual, physical machine (there is no "Universal 2" chip)" (Fulton 2020).
​
          Rosetta 2 has proven incredible in its ability to translate most x86 Mac apps, translating in real time so they execute quickly on ARM-based Macs. In a YouTube video, Quinn Nelson of the channel Snazzy Labs demonstrated the power of the M1 and its Rosetta 2 translation. Nelson ran a Windows app inside Wine, which translates Windows binaries into Intel Mac binaries; Rosetta 2 then sat underneath Wine’s translation and converted that Intel binary to Apple Silicon. The app ran flawlessly, even faster than on a lower-specced Windows machine (Nelson 2020). This example shows the significance of Rosetta 2 and the M1’s ability to run non-native applications.
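Rosetta 2 is transparent to the user, but macOS does expose whether a given process is being translated via the sysctl key sysctl.proc_translated (1 means translated, 0 means native). As a small illustration, not from the article, here is a sketch that shells out to the sysctl command and reports the status, returning None where the key doesn’t exist (Intel Macs, non-Mac systems):

```python
# Minimal sketch: ask macOS whether the current process is running
# natively or under Rosetta 2 translation, using the documented
# "sysctl.proc_translated" key (1 = translated, 0 = native).
import subprocess


def rosetta_status():
    """True if translated by Rosetta 2, False if native,
    None if the key is unavailable (Intel Macs, non-macOS)."""
    try:
        out = subprocess.run(
            ["sysctl", "-n", "sysctl.proc_translated"],
            capture_output=True, text=True, timeout=5,
        )
    except (OSError, subprocess.TimeoutExpired):
        return None
    if out.returncode != 0:
        return None
    value = out.stdout.strip()
    return value == "1" if value in ("0", "1") else None


print(rosetta_status())
```

On an M1 Mac this should report False for a natively running (Universal 2) binary and True for the same program launched under Rosetta 2 translation.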
How Does the M1 Compare?
          With its A-series mobile chips, Apple has benchmarked significantly higher than Qualcomm while also edging closer to beating what Intel has to offer (Sims 2017). This shows up in synthetic benchmarks, lab stress tests, and real-world usage. In the past two years, Apple’s performance has exceeded what Intel and AMD offer. For example, the latest iPhone 12 Pro uses Apple’s A14 Bionic SoC, a 6-core, 3.0 GHz processor paired with 6 GB of RAM, and it posts a Geekbench score of 3948 multi-core and 1587 single-core. Compare that to a quad-core 4.0 GHz Intel Core i7-4790K from 2014, with a multi-core score of 3945 and a single-core score of 1058 (Geekbench.com Data Charts). That is a full-fledged, unlocked desktop CPU cooled by a heatsink against a fanless mobile chip inside a smartphone. It is revolutionary that, despite the A14’s thermal constraints, it is able to outperform a full-fledged desktop processor. Sims sums up why Apple’s success has been so consistent:
          "There is no denying that Apple has a world class CPU design team that has consistently produced the best SoCs in the world over the last few years. Apple’s success isn’t magic. It is a result of excellent engineering, a good lead time over its competitors, and the luxury of making SoCs with lots of silicon for one or two products at a time" (Sims 2017).
 
          In November 2020, at Apple’s “One more thing” event, Apple dropped the M1-powered 13-inch MacBook Air, 13-inch MacBook Pro, and Mac mini desktop. Of the three, only two are actively cooled with an actual fan; the MacBook Air, ironically, has no traditional heatsink, just a single metal shield for heat dissipation (Goldheart 2020). Casey details the performance increase over the previous-generation Intel machines:
          "Both M1 MacBook’s are rated for substantial leaps over their predecessors. Specifically, Apple notes the M1 MacBook Air is 3.5x as fast as the most-recent Intel MacBook Air, and that the M1-based MacBook Pro is 2.8x as fast as its predecessor. That's the kind of performance that could topple the Intel Core i9 version of the 16-inch MacBook Pro (which isn't one of the first round of Macs getting ported to M1 chips)" (Casey 2021).
​
          The M1 is an 8-core CPU split between 4 high-performance cores and 4 high-efficiency cores, paired with either 8 or 16 GB of unified memory integrated into the M1 package. This allows memory access to be quicker than with the separate memory chips found in previous models.

          The current M1-equipped machines all score roughly the same, which is expected given they share the same processor. On Geekbench, the M1 Macs average 7681 on multi-core and 1740 on single-core performance. That puts them right next to the $10,000 Mac Pro with its 8-core server-class Xeon workstation CPU, which has a multi-core score of 7959. The M1 also outperforms the $3,000 8-core Intel Core i9 16-inch MacBook Pro, which scores 6850 multi-core (Geekbench.com Score Charts). This is the kind of performance advantage you can expect across the board as every Mac switches to ARM.
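To put those Geekbench numbers in perspective, a quick back-of-the-envelope calculation using only the multi-core scores quoted above:

```python
# Multi-core Geekbench scores as cited in the text.
m1_score = 7681        # M1 MacBook Air/Pro, Mac mini (average)
mac_pro_score = 7959   # $10,000 Mac Pro, 8-core Xeon
mbp16_score = 6850     # $3,000 16-inch MacBook Pro, 8-core Core i9

# Relative multi-core performance of the M1.
vs_mac_pro = m1_score / mac_pro_score   # within a few percent of the Xeon
vs_mbp16 = m1_score / mbp16_score       # ahead of the Core i9

print(f"M1 vs Mac Pro: {vs_mac_pro:.2f}x, M1 vs 16-inch MBP: {vs_mbp16:.2f}x")
```

A fanless or single-fan machine landing within roughly 3% of a Xeon workstation, and about 12% ahead of the Core i9 MacBook Pro, is the M1 story in two ratios.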
        Given the M1’s track record, we can anticipate that the chips coming in this year’s iMac and Mac Pro refreshes will be dramatically faster still. Apple has also achieved an unheard-of 20-hour battery life on the M1 13-inch MacBook Pro. In a battery rundown test by Linus Tech Tips, the M1 MacBook Pro lasted just shy of 20 hours, compared to 11 hours for the Intel 13-inch MacBook Pro, 12 hours each for the Intel-powered Dell XPS 13 and MacBook Air, and 13 hours for an HP Envy x360 (Sebastian 2020). The 13-inch MacBook Pro has the same-sized battery in both its Intel and M1 variants, yet the M1 model lasted roughly 9 hours longer thanks to the M1’s power efficiency.
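Since both 13-inch MacBook Pro variants carry the same battery capacity, the runtime figures translate directly into average power draw. A quick sanity check on the numbers above:

```python
# Battery rundown results cited above (Sebastian 2020); identical battery
# capacity in the Intel and M1 13-inch MacBook Pros.
m1_hours = 20
intel_hours = 11

runtime_gain = m1_hours / intel_hours        # ~1.82x the runtime
power_saving = 1 - intel_hours / m1_hours    # ~45% lower average draw

print(f"{runtime_gain:.2f}x runtime, {power_saving:.0%} less average power")
```

In other words, on the same battery the M1 drew roughly 45% less average power than its Intel predecessor over the course of the test.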
​What has Microsoft done with ARM in the past?
          It seems Apple has its roadmap planned for a successful transition to its own ARM architecture. But what does that mean for Windows? Microsoft has had its fair share of failures in the past, and its attempts to run Windows on ARM processors were not good enough to entice consumers to switch. In 2012 Microsoft shipped the ARM-powered Surface RT tablet, and it has since partnered with Qualcomm on ARM-based Surface devices. Torres, a writer for SlashGear, examined Microsoft’s past experiences with ARM devices:
          "Ever since the first Surface and Lumia 2520, Microsoft and Qualcomm have been working together to bring Windows to devices that promise long battery life, always on and always connected computing, and lightweight productivity. Save for the short-lived Windows Phone “spinoff”, it hasn’t all been that successful. And as they say, those who don’t learn from the past are bound to repeat it" (Torres 2019).
​
          Windows 10 at the time could barely translate any apps from Intel binaries to ARM binaries. Traditional low-powered mobile PCs were still x86-based, using slower processors such as underclocked Intel Atom or Celeron CPUs. ARM, on the other hand, achieves performance and efficiency with hardly any tradeoffs. Apple has been putting resources into ARM development for over 11 years, whereas Microsoft has only been developing for ARM over the last 4 years. Torres elaborates on Microsoft’s failure to successfully bring Windows to an ARM-based platform:
          "Windows’ sojourn outside the x86 world mostly failed on three accounts: software, performance, and expectations. Windows RT launched with an app store that was practically a ghost town and although the Microsoft Store now has more residents, it’s still a far cry from the populous Android and iOS towns. Microsoft sought to remedy that by introducing a compatibility layer that would make win32 software run on ARM hardware. Even on the newest Surface Pro X, performance on that front is inconsistent at best" (Torres 2019). 
          Microsoft needs to look at Apple’s approach to ARM in order to successfully transition in the future.
​Learning from Apple’s Success
          The key to Apple’s success is its tight-knit community of developers. Following Apple’s example, Microsoft should build new and improved developer tools to help transition, translate, and build apps that run on ARM. The current lineup of ARM-supported apps in the Microsoft Store is minimal, or as Torres calls it, “a ghost town”. Compare that to Apple’s Project Catalyst and Xcode tools, which let any iOS app run natively on macOS with the click of a checkbox. Apple has millions of apps on the App Store that, with the help of Rosetta 2, fill the transition gap until developers ship M1-native Mac apps. Microsoft needs the support and interest of developers to build a platform that runs successfully on ARM processors. Windows 10 itself has gotten better, especially on the Qualcomm-powered Surface Pro X, but it still falls way behind on even the most mundane tasks. The Surface line trails the iPad Pro in performance and efficiency, and the iPad Pro does not even run a full-fledged desktop operating system. Bohn, a notable tech journalist at The Verge, explains why a tight-knit community and a clear roadmap are necessary for success:
          "As it so often has over the past decade, Windows offers a roadmap of where things could go awry for the Mac. Windows on ARM still has unacceptable compromises for most users when it comes to software compatibility and expectations. I say this as a person who walked into those compromises’ eyes wide open, buying a Surface Pro X. I essentially use it as a glorified Chromebook and it’s very good at being that thing, but there’s no way Apple would want that for its Mac users” (Bohn 2020).
​
          It is like building an electric car without any charging infrastructure to support it. If Microsoft is to fully commit to a transition to ARM, it needs a well-thought-out roadmap to dig itself out of its current rut. The most crucial step, as Apple has demonstrated, is engaging the developer community with proper tools and support. For years now, big cloud computing companies such as AWS have successfully run ARM-based CPUs in their compute and storage farms. If Microsoft wants to compete with macOS and Linux machines in the server industry, it may need to shift some focus away from Office products and expand its Windows development team. Bohn points this out by saying:
          "Windows on ARM simply isn’t getting the developer attention and support that standard Windows gets, both within Microsoft and outside it. It was the same with many of Microsoft’s other Windows gambits — simply witness how many times it has rebooted its app framework strategy – a complete mess" (Bohn 2020).
          Microsoft’s strategy of creating one universal operating system for all platforms spreads its development efforts thin, as opposed to Apple’s strategy of focusing on a singular platform.
​Intel’s Strategy and Moving Forward
          At a recent press conference, Intel announced a new business strategy called IDM 2.0, backed by a $20 billion investment that will open new production facilities in Arizona to produce custom chips for other companies. “As reported by Business Wire, Gelsinger today shared his idea for the evolution of Intel’s business model, which he calls “Integrated Device Manufacturing” or “IDM 2.0.” Intel wants to expand its operations to not only expand the development of its own chips, but also to build chips for third-party vendors” (Esposito 2021). This lets Intel make money both from manufacturing its own chips and from building custom chips for others.
          Esposito goes on to detail: “As competition has been increasing with AMD and other companies like Apple, which recently began transitioning from Intel processors to its own chips in Macs, Intel now wants to take a step further and open up its production capacity to build and deliver custom chips for third parties. The company wants to use its facilities in the US and Europe to meet the global demand for semiconductor manufacturing” (Esposito 2021). Intel understands that if it wants to remain relevant through this market transition, even without an ARM chip of its own, it must offer a solution to compete with the likes of Apple and Nvidia.
          Gelsinger’s announcement specifically states that Intel will work with anyone to build processors: “As part of Intel’s Foundry Services, the company is announcing that it will work with customers to build SoCs with x86, Arm, and RISC-V cores, as well as leveraging Intel’s IP portfolio of core design and packaging technologies.” Notably, Gelsinger calls out the ARM and RISC-V architectures, and ARM is exactly what Apple builds its custom Apple Silicon processors on.
        He elaborated on Intel’s interest in working with Apple during the Q&A session of the press conference. “Gelsinger mentioned that Intel is currently working with partners, including Amazon, Cisco, IBM, and Microsoft. But he pushed a bit further during a Q&A session with press, saying that he’s even pursuing Apple’s business” (Esposito 2021). This change in direction reaffirms the threat ARM poses to the traditional x86 platform: by offering custom ARM-based chips to third parties, Intel can build ARM expertise for others before it ever designs and produces an ARM processor of its own.
Conclusion
            Apple’s success with mobile processors has proven to be a modern marvel in how we compute today. Its chip design team has delivered consistently, staying ahead of market expectations every single year, and its new desktop-class processors have shown strength that could push the industry toward ARM. The transition comes down to the developer community’s interest, combined with proper developer tools, to support new platforms running on ARM-based PCs. Microsoft has the ability to push the industry forward if it focuses its efforts and sticks to a single game plan. With Apple leading ARM into the consumer market, end users will get to experience a whole new era of computing, all thanks to an architecture change that brings performance, value, and efficiency the likes of which we have never seen. There has never been a better time to transition the entire PC industry to ARM, just as the industry has already done with mobile devices.
References
Blem, E., Menon, J., & Sankaralingam, K. (2013). A Detailed Analysis of Contemporary ARM and x86 Architectures. Minds at Wisconsin. https://minds.wisconsin.edu/bitstream/handle/1793/64923/Blem%20tech%20report.pdf?sequence=1&isAllowed=y
Bohn, D. (2020, June 10). What Windows can teach the Mac about the switch to ARM processors. The Verge. https://www.theverge.com/2020/6/10/21285866/mac-arm-processors-windows-lessons-transition-coexist
Casey, H. (2021, January 06). Apple M1 chip specs, release date, and how it compares to Intel. Tom’s Guide. https://www.tomsguide.com/news/apple-m1-chip-everything-you-need-to-know-about-apple-silicon-macs
Dignan, L. (2020, June 09). Apple to move Mac to Arm CPUS: What you need to know. ZDNET. https://www.zdnet.com/article/apple-to-move-mac-to-arm-cpus-what-you-need-to-know/
Dilger, D. E. Apple Silicon M1 13-inch MacBook Pro review - unprecedented power and battery for the money. AppleInsider. https://appleinsider.com/articles/20/11/17/apple-silicon-m1-13-inch-macbook-pro-review---unprecedented-power-and-battery-for-the-money
Esposito, F. (2021, March 24). Intel to build ARM chips for other companies as part of its new business strategy. 9to5Mac. https://9to5mac.com/2021/03/23/intel-to-build-arm-chips-for-other-companies-as-part-of-its-new-business-strategy/
Fulton III, Scott. (2020, September 14). Arm processors: Everything you need to know. ZDNET. https://www.zdnet.com/article/introducing-the-arm-processor-again-what-you-should-know-about-it-now/
Goldheart, S. (2020, November 19). M1 MacBook Teardowns: Something old, something new. iFixit. https://www.ifixit.com/News/46884/m1-macbook-teardowns-something-old-something-new
Lei Xu, Zonghui. (2012, September 01). The study and evaluation of arm-based MOBILE virtualization. Sage Journals. https://journals.sagepub.com/doi/10.1155/2015/310308
Lovejoy, B. (2020, July 13). Switch to ARM won’t just be Macs, but better Windows PCs too. 9to5Mac. https://9to5mac.com/2020/07/13/switch-to-arm/
Nelson, Q. (Director). (2020, December 12). Apple M1: Much more than hardware [Video]. YouTube. https://www.youtube.com/watch?v=ff98l3P66i8
Regidi, A. (2020, November 14). The future of PCs is in Apple’s arms: Breaking down the Apple-Intel breakup. Tech2. https://www.firstpost.com/tech/news-analysis/the-future-of-pcs-is-in-apples-arms-now-heres-why-8554331.html
Sebastian, Linus. (2020, December 26) Apple made a BIG mistake – M1 MacBooks Review [Video]. YouTube. https://www.youtube.com/watch?v=KE-hrWTgDjk
Sims, Gary. (2017, October 02). Why are Apple's chips faster than Qualcomm's? - GARY EXPLAINS. Android Authority. https://www.androidauthority.com/why-are-apples-chips-faster-than-qualcomms-gary-explains-802738/
Torres, J. (2019, December 25). Windows on arm needs to tell a different story. Slash Gear. https://www.slashgear.com/qualcomms-and-microsofts-windows-on-arm-needs-to-tell-a-different-story-25604295/
Follow Daniel Burns on Twitter, @DBurnsOfficial.

PHP in Healthcare

12/15/2020


 
Written By Daniel Burns
As healthcare systems around the world evolve into a fully digital experience, developers in the healthcare industry are looking for a language and platform to focus their attention on. PHP has not been a slacker by any measure. From its easy database integration to its ever-evolving speed and maturity, PHP is a widely adopted tool in the healthcare industry today, and it may surprise you just how widely it is used.

    PHP has been around since the mid-90s, originally created as a scripting language to interface with binaries written in the C programming language (5). Iteration after iteration, its popularity has grown to the point where it forms the basis for about 80% of the active web today (1). That’s right: 8 out of every 10 websites you have visited were most likely built with PHP. A few major sites you may know are Facebook, WordPress, Flickr, and Wikipedia, just to name a few (1). PHP is known for its ease of use and fully integrated server-side scripting. Since it has been around for a while and is still considered modern, you’ll be hard-pressed not to find senior developers to hire who know how to program in PHP; that’s not to discredit younger developers either. In the past 10 years, PHP’s usage share among developers has grown from 11% to 26.2% worldwide. That’s only about 6 points less than the already popular C# and about 15 points less than Java at 40% (6). This gives anyone in the industry hope that PHP should stick around for at least the next 10 years (obviously until something way better comes along). So why choose PHP for programming in the healthcare industry? Simply because of its high availability, wide adoption, and great integration with databases, something hospitals have an ever-growing number of as we approach an always-connected world in 2021.
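The market-share arithmetic above works out as follows (a quick check of the figures cited from the Statista survey in resource 6, sketched here in Python purely for the arithmetic):

```python
# Developer usage-share figures cited above (Statista, resource 6).
php_2010, php_2020 = 11.0, 26.2   # PHP share, ten years apart
java_2020 = 40.0

growth = php_2020 - php_2010      # +15.2 percentage points
factor = php_2020 / php_2010      # roughly 2.4x growth
java_gap = java_2020 - php_2020   # ~13.8 points behind Java's 40%

print(f"+{growth:.1f} points ({factor:.1f}x), {java_gap:.1f} points behind Java")
```

So the gap to Java is closer to 14 points than 15, but the direction of travel is the point: PHP has more than doubled its share in a decade.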

    From 2004 to 2017, hospital adoption of digital EHR (Electronic Health Record) systems grew to 96% (2), and for good reason. Compared to paper records, an EHR helps save lives by keeping live, up-to-date, accurate information that can be sent from floor to floor, hospital to hospital, or anywhere it is needed. The amount of data created is massive, and few languages handle large amounts of data in conjunction with a database as readily as PHP. Data is entered through forms all the time, and an always-available system or webpage for importing patient data is vital to an EHR system’s success; PHP is perfect for sending forms of information into a system. PHP is huge in building healthcare web apps: an application that can be accessed from any device, whether desktop, mobile, or tablet, is an amazing tool for collecting and retrieving vital data, records, and insurance connections for the patient (1). The impact on quality of service benefits not just patients but also healthcare workers, i.e., doctors, nurses, and techs, and the industry’s success as a whole. As a patient, being able to order prescriptions online, schedule appointments, see past visits, or view lab results as soon as they become available is amazing, something unheard of 20 years ago. Especially in 2020, with the COVID-19 pandemic, seeing test results as fast as possible is vital to everyone’s safety, and much of that is done with PHP. PHP isn’t just for web apps either; it’s a full-blown programming language that lets developers create great applications for any app store to, for example, connect patients and doctors over video chat. Easy migration of web apps to standalone phone, tablet, and PC apps expedites development, cuts down on costs, and reaches an even broader audience of patients.
Something the industry has been trying to solve to save lives and make everyone healthier as a result. Many EHR systems are connected to massive databases running MySQL, SQLite, FrontBase, Oracle, IBM DB2, ODBC-accessible stores, and much more, all highly compatible with PHP (1). PHP is famously web-based and powers many social media platforms, so it is known to handle large numbers of simultaneous visitors at once. That is exactly what any healthcare EHR system needs to run efficiently, stay available at all times, and serve any number of users anywhere in the world (3).

    Development, integration, and high availability enable lots of data to be produced and stored for many uses. Big Data comes to mind, along with its impact on the healthcare industry as a whole (7). With vast amounts of data about patients, visits, locations, and illnesses, trends and predictions can be produced to help us understand what’s going on around us. Prediction in particular, figuring out when the next pandemic may hit and how to isolate and stop it, is now likely priority number one in the industry. Real-time alerts based on location, patient data, and past results are already a reality today. For example, if you have an Apple Watch with fall detection on and you fall, or your watch tells you that you’ve been having irregular heart rates, that data can be sent to your doctor with the push of a button to diagnose you before anything major, like a heart attack, occurs. With all that data collected, the watch can even suggest workouts and ways to keep in shape if it detects correctable anomalies. Isn’t that amazing? PHP, if implemented correctly, is also highly secure (5), with safer ways to handle user data built right in as it is sent to servers for storage. After chatting with a few friends on the healthcare developer side of the industry, they all confirm a push toward prediction, analytics, and, broadly, the integration of Big Data, all fed by input and systems that run on PHP.
​
    This highly customizable, expandable, and reliable platform is here to stay, and I can only predict that its adoption will keep growing in the healthcare development world. PHP is widely adopted, easy to develop on, and highly secure if built right: perfect for the healthcare industry.


Follow Daniel Burns on Twitter, @DBurnsOfficial

Resources:
1. Sharma, S. (2020, September 07). Why To Choose PHP For Healthcare Web application Development ? Retrieved December 14, 2020, from https://code.likeagirl.io/why-to-choose-php-for-healthcare-web-application-development-28ac76b017e8
2. Non-federal Acute Care Hospital Electronic Health Record Adoption. (n.d.). Retrieved December 14, 2020, from https://dashboard.healthit.gov/quickstats/pages/FIG-Hospital-EHR-Adoption.php
3. PHP for Healthcare Industry - What can be done? - PHP Development. (n.d.). Retrieved December 14, 2020, from https://sites.google.com/site/phpapplicationdevelopmentindia/php-for-healthcare-industry
4. Which is Best Programming Language For Healthcare Apps? (2020, July 06). Retrieved December 14, 2020, from https://www.enukesoftware.com/blog/which-is-best-programming-language-for-healthcare-apps.html
5. History of PHP - Manual. (n.d.). Retrieved December 14, 2020, from https://www.php.net/manual/en/history.php.php
6. Liu, S. (2020, June 11). Most used languages among software developers globally 2020. Retrieved December 14, 2020, from https://www.statista.com/statistics/793628/worldwide-developer-survey-most-used-languages/
7. Durcevic, S. (2020, October 21). 18 Examples of Big Data In Healthcare That Can Save People. Retrieved December 14, 2020, from https://www.datapine.com/blog/big-data-examples-in-healthcare/

It may be the end for DirecTV as streaming services thrive.

3/10/2020


 
Written By Daniel Burns
   This week I was surprised to read about the direction AT&T has laid out for the future of its acquired satellite TV service, DirecTV. At an investor conference, John Stankey, president of AT&T, admitted that DirecTV is no longer seeking subscribers for its service. DirecTV has seen its subscriber count fall from 20 million to 16 million since AT&T bought the company for $49 billion back in 2015. Initially seen as a great addition to AT&T’s media arsenal, DirecTV is now becoming a pain in the neck that might eventually have to be cut off.
​
    Stankey said AT&T will continue to sell DirecTV, but only in “more rural or less dense suburban areas.” Given that DirecTV originally launched in 1994 as a way for rural subscribers to watch TV in areas not covered by traditional cable, it’s not much of a surprise that strategy isn’t working well today. “But in terms of our marketing muscle and our momentum in the market, it will be about software-driven pay-TV packages.” AT&T TV (starting at $50), not to be confused with the dying U-verse, made its debut this past week and has been greeted with a poor response due to its equipment rental and two-year contract requirements. That structure is exactly what people are trying to avoid by streaming in the first place. Anyone can go out and buy an Apple TV or another streaming device that runs apps and services such as Netflix, Amazon Prime, Hulu, and YouTube TV, so the need to rent a device that does the same thing doesn’t appeal to consumers anymore.

    AT&T isn’t the only company pushing this rent-our-device model. Both Cox and Comcast offer a wireless standalone smart box running Comcast’s X1 platform that delivers basic cable and apps over the internet. It just doesn’t make sense to rent a box to watch TV anymore. If you buy an Apple TV 4K, for example, it offers a far more premium experience than anything big cable can put out. In fact, the box required for AT&T’s new streaming service is just a custom-made Android TV box with an AT&T-branded skin on it, nothing too special at all. So AT&T’s decision to cut back on DirecTV ends up making a lot of sense from a marketing perspective.

    One big draw of DirecTV is NFL Sunday Ticket, which gives you access to every live game across the country in one package. But in two years, DirecTV’s contract with the NFL is set to expire, and it doesn’t look like AT&T is planning to renew it. As AT&T looks to expand its presence in the now-crowded streaming market with HBO Max and AT&T TV, it needs to shift resources away from the aging DirecTV. There has been talk of selling off DirecTV after Sunday Ticket expires to satellite competitor Dish Network, which, ironically, also owns Sling TV in the streaming TV space. According to Wall Street analyst firm MoffettNathanson, about 6 million subscribers ditched satellite and cable in 2019 alone. If a sale were to happen, Dish would be the only satellite TV provider in the US catering to those who still need satellite TV, with a combined total of over 25 million subscribers. With the emergence of 5G and internet seemingly almost everywhere these days, streaming TV is bound to be the go-to option when available. Whether the streaming market is becoming too crowded is another story to cover.

    So, what happens to anyone wanting to sign up for DirecTV in urban areas? Looks like it’s tough luck for them; just switch to streaming, because it’s the end of a long and pricey era in traditional pay TV. There are two main live TV streaming services that I see surviving: YouTube TV ($49.99), which is owned by Google, and Hulu with Live TV ($55), which is owned by Disney. Both services have big backing and carry plenty of potential to survive if they can keep prices down, or at least low enough that, combined with the price of internet, they still beat the traditional cable package of yesteryear.

   In my strongest opinion, cable TV is just about dead and DirecTV is just another nail in the coffin. It was a terrible purchase by AT&T and I’m pretty sure they see that now. Before AT&T bought them, their service was decent, very competitive with cable (at least in the first year), and their technology was actually pretty good. But it all went downhill as they were bought, left to rust, and streaming TV apps on flashy new platforms started to look a lot better. So here’s one last toast to DirecTV as it’s put out to pasture and sold off for scraps.

Follow Daniel Burns on Twitter, @DBurnsOfficial

Comments

If music streaming services want to survive, then original content may be the answer.

1/4/2020

Comments

 
Written by Daniel Burns
      Jimmy Iovine is an American record producer who founded Interscope Records in 1990. In 2006, he and rapper Dr. Dre founded Beats Electronics. Apple bought Beats and its just-off-the-ground music streaming service in May 2014 for $3 billion. Apple then turned that service into Apple Music and continues to make and sell Beats products today, but with an Apple accent. I recently read an article in The New York Times by Ben Sisario interviewing Iovine about leaving Apple, and it was what he said about the music streaming industry that got me thinking. Music streaming services need something to set them apart from each other, and at the moment they are all pretty much the same. Iovine outlined a central problem with streaming services today, calling them utilities.

    “And the streaming music services are utilities — they’re all the same. Look at what’s working in video. Disney has nothing but original stuff. Netflix has tons of original stuff. But the music streaming services are all the same, and that’s a problem.”

    What he said is what I hope other streaming services pick up on loud and clear: music streaming services need exclusivity and original content to survive, much like the original content Disney and Netflix produce for themselves. As we know from video, original content draws people in and creates a loyal fan base of viewers who come to a platform exclusively for a certain show or movie. But with music, all the services have pretty much the same catalog of artists and albums to choose from. Aside from special exclusive releases, everything ends up on all the services eventually. You don’t see a Netflix-made show on Amazon Prime Video, and you probably never will. So what’s stopping, let's say, Apple or Spotify from doing the same thing? The only thing that comes to mind is record labels and contracts with artists. If Apple were to sign artists to do exclusive content just like it does with TV+, then I think it would succeed over Spotify by bringing in more loyal listeners. It wouldn’t be very hard for them either. Apple has had a much better track record of good relationships with artists than Spotify. If you remember the Taylor Swift pay rights debate, Apple responded. Spotify has been in hot water before for paying artists less on its service compared to Apple. And this isn’t just happening with streaming video; the game streaming community is seeing it too, with Amazon’s Twitch competing against Microsoft’s Mixer. Mixer picked up exclusive deals with former Twitch streamers Ninja and Shroud in hopes of attracting more subscribers to its new service, and to some degree it has succeeded. Iovine also highlights the biggest problem with growing a music streaming service.

    “It doesn’t scale,” Iovine said. “At Netflix, the more subscribers you have, the less your costs are. In streaming music, the costs follow you.”

    Even with a healthy, growing number of subscribers, the same pay rate per play, set by contracts and equal opportunity deals with artists and record labels, doesn't go away unless the service itself, like Apple, were to sign with the artist, produce and record in house, and release the song or album exclusively as original content. Apple has some very smart people working for it, and Spotify does too, so one day I suspect this will happen. In fact, it will need to happen to keep rates the same and to make music streaming more profitable. This also applies to Spotify's growing podcast business, which competes with the gold standard, Apple's standalone Podcasts app. Whether or not this leads to tiered levels of service remains to be seen. But something has to be done, and the name "an Apple Music Original" doesn’t sound so crazy.
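To see why "the costs follow you," here's a toy back-of-the-napkin model. Every dollar figure is made up for illustration; only the shape of the math matters: a fixed originals budget gets cheaper per subscriber as you grow, while per-stream royalties never do.

```python
# Toy model of Iovine's scaling point. All dollar figures are invented;
# only the shape of the curves matters.

def video_cost_per_subscriber(subscribers, fixed_content_cost=1_000_000_000):
    """A fixed originals budget spread across the subscriber base."""
    return fixed_content_cost / subscribers

def music_cost_per_subscriber(streams_per_month=500, royalty_per_stream=0.004):
    """Per-stream royalties mean the per-subscriber cost stays flat."""
    return streams_per_month * royalty_per_stream

# Growing 10x cuts the video service's per-subscriber content cost 10x...
print(video_cost_per_subscriber(10_000_000))   # -> 100.0
print(video_cost_per_subscriber(100_000_000))  # -> 10.0

# ...while the music service pays the same per subscriber no matter its size.
print(music_cost_per_subscriber())             # flat regardless of growth
```

Original, in-house content is the one cost that behaves like Netflix's: pay once, amortize over every new listener.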

Quoted New York Times Article

Follow Daniel Burns on Twitter, @DBurnsOfficial
Comments

Five Mobile Device Security Best Practices

12/4/2019

Comments

 
Written by Daniel Burns
     Our phones and mobile devices are some of the most important pieces of technology in our everyday lives. They are with us all the time and know practically everything about us, probably even more than our best friends do. All that information is personal data stored on the device, just waiting to be taken. So we must take precautions to protect our personal privacy and our data from prying eyes. Here are five best practices to keep your personal data safe and your phone away from unwanted data lurkers.

     First and most importantly is the physical security of your device. You want to make sure that your device is with you all or most of the time, especially in public places. Someone can easily take, clone, or steal your device if it is left open and vulnerable. To combat this, make sure you have a strong passcode or lock screen password of some kind on your device. The standard for years has been 4-digit passcodes, but 6-digit passcodes are now becoming the norm because they increase the number of combinations needed to unlock the device. Don’t use a simple code like 1234 or your birth year like 1998; those are too easily guessed. Also avoid any other obvious code such as your address, significant dates, or Social Security number. Pick something you will remember but no one else could guess.

     Next, enable biometrics on your device to add an extra layer of protection on top of a simple passcode. Passcodes can easily be seen typed or tapped in public, and prying eyes can remember a simple code. Biometrics are easy, simple, convenient, and with today’s cool new technology, even fun. There are two main types of biometrics popular on the market today: fingerprint readers and facial recognition. Both are conveniently offered during setup of the device and can be enabled in your respective device’s settings. It really feels like the future to just glance at your phone and have it unlock for you. Finally, avoid public Wi-Fi such as “Starbucks Wifi” when you can, especially for banking or secure and sensitive work information. Wi-Fi networks can be spoofed to gather data without you ever noticing. Disabling Bluetooth and NFC when not in use can also help your device security, though this usually pertains more to Android users than iPhone users.
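To put some numbers behind the 4-digit versus 6-digit advice, here's a quick sketch: each extra digit multiplies the number of possible codes by ten.

```python
# Each passcode digit has 10 possibilities (0-9), so the number of codes
# an attacker would have to try grows by 10x per digit.
def passcode_combinations(digits: int) -> int:
    return 10 ** digits

four_digit = passcode_combinations(4)  # 10,000 possible codes
six_digit = passcode_combinations(6)   # 1,000,000 possible codes
print(six_digit // four_digit)         # -> 100, i.e. 100x harder to brute-force
```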

    Second, we have passwords in general. Password best practices have usually meant long, complex passwords made of many numbers, letters, and characters, but those are hard to remember, so people tend to write them down. To avoid that security nightmare of sticky notes with passwords scattered around the office, use long but memorable passphrases instead. These usually consist of a memorable sentence or phrase with some numbers and special characters rather than a mix of meaningless garble. If you still can’t remember them and just want an even easier way to save your passwords, use a password management app to remember them for you. Just make sure it is encrypted and secured with a master password, like Apple’s Keychain, 1Password, or others like Dashlane. Most of them, even third-party password management apps, now unlock and autofill account passwords using biometrics such as Face ID and Touch ID: an easy, convenient, and secure way to keep passwords safe.

    Third, speaking of encryption, make sure you enable it on your device and use encrypted services whenever possible. Many modern devices have the option to enable encryption and secure your data if the device were ever taken. In fact, all iOS devices come encrypted as standard, and on Android you can enable it in settings (just make sure your Android device is not rooted and is up to date). Encrypted services such as secure messaging apps keep your messages hidden and unreadable by anyone but the recipient. This is why iMessage and WhatsApp are preferable over traditional SMS and other direct messaging services and apps.
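Here's a rough, illustrative sketch of why a long, memorable passphrase can hold its own against a short "complex" password. The pool sizes are assumptions (94 printable characters; a 7,776-word Diceware-style word list), and real-world strength depends on how the secret is actually generated, not just its length:

```python
import math

# Rough entropy estimates. Pool sizes are assumptions: ~94 printable ASCII
# characters for a "complex" password, a 7,776-word Diceware-style list for
# a passphrase. Only randomly generated secrets actually reach these numbers.
def entropy_bits(pool_size: int, length: int) -> float:
    """Bits of entropy for `length` symbols drawn uniformly at random."""
    return length * math.log2(pool_size)

print(round(entropy_bits(94, 8), 1))    # 8 random printable chars: ~52.4 bits
print(round(entropy_bits(7776, 4), 1))  # 4 random words: ~51.7 bits
print(round(entropy_bits(7776, 5), 1))  # 5 random words pull ahead: ~64.6 bits
```

Five ordinary words are both stronger than the eight-character jumble and far easier to remember.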

    Fourth, and in my opinion the most important overall, is keeping your devices up to date. If you are running the most recent software, good on you! That means you are less vulnerable to attacks and less of a security risk to yourself and even your business. Software updates don’t just introduce new features and visual improvements; they are vital to keeping hackers and viruses at bay, and they almost always include security patches and bug fixes to shore up vulnerabilities in the operating system. So turn on automatic updates both for apps from the app store and for your operating system to stay on the leading edge of security. Plus, you get to enjoy those new features on the side.

    Fifth and last, we must understand the difference between trusted and untrusted sources. This pertains to where you download and install apps from and whether those places can actually be trusted. A trusted source is the respective app store for your operating system: on iOS it is the App Store with its blue “A” icon, and on Android it is the Google Play Store with its colorful play button icon. Downloading apps from these places is generally safe and unlikely to harm your phone with viruses or worms. Untrusted sources, on the other hand, are places like random download sites and file sharing services that let you install unknown apps. This is mostly an Android and Windows problem, due to the open, non-walled-garden structure of those operating systems. So install antivirus software to check for any strange attachments that may have come with a download, and make sure “unknown sources” is disabled in settings.

    So, if you follow these five simple precautions with your mobile devices, you should have far less to worry about when it comes to protecting your personal data. Our phones and devices know practically everything about us, so any rational and ethical human would want to keep that data from falling into the wrong hands. Keeping your devices on you, protected, secured, and up to date, while sticking to trusted sources, can ensure that some normalcy of privacy continues. Aside from the regular data mining by big brother Google, that is.

Follow Daniel Burns on Twitter, @DBurnsOfficial

Comments

Geotracking: Privacy versus Convenience

11/13/2019

Comments

 
Written by Daniel Burns
     Geotracking is the practice of services and platforms using location data gathered from a device, whether from GPS coordinates, IP addresses, or Bluetooth connections, just to name a few sources. It has long been a convenient way to give users a better experience with mapping, friend finder, and social media apps, and other services like Uber. But does all this location tracking and data gathering do more harm than good in our technology-driven society?

    To answer that question we must understand why we love location-based services on our many devices. For social media users, location services let you check in and tag a location on a post or photo to share your experience with friends and family. For families, people-finder apps such as Find My (iOS) and MyCircle (Android) make it easy and convenient to check up on your kids’ and family members’ locations, however intrusive to personal privacy that may be. Then there are apps such as Google Maps, Uber, and Lyft, which require your location to even function properly. These are just a few of the hundreds of examples of how we use location tracking throughout our daily lives. But geotracking isn’t just location data based on GPS coordinates; it can be gathered from other sources such as IP addresses and Bluetooth connections. For example, if you go to Google.com and search for a local business or service, it will ask to use your location. Most people hit yes, and in doing so allow Google to use their computer’s IP address to locate them in their community and, at the very least, their current region. That’s right: even on sites that don’t ask to use your location, companies can see where you are generally located (your city, for example) to tailor advertisements and specific services just for you. And all that data is out there for anyone to use, whether for good and in your favor, or with bad and malicious intent.
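To show just how precise GPS-based geotracking is, a few lines of plain Python can turn two location fixes into a real-world distance. The coordinates below are arbitrary San Diego-area examples, not real posts:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two GPS fixes."""
    earth_radius_km = 6371.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * earth_radius_km * math.asin(math.sqrt(a))

# Two check-ins a few blocks apart (example coordinates)
print(round(haversine_km(32.7157, -117.1611, 32.7270, -117.1550), 2))
```

Anyone holding two of your tagged posts can run exactly this kind of math, which is why sharing raw coordinates is worth a second thought.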

    Two major concerns come up with geotracking and its location-tracking nature: what does geotracking actually reveal, and what are my rights and what can I do to protect myself? First, geotracking can reveal a lot about our daily lives. For example, if you go on vacation to, let’s say, Hawaii, and you post pictures to Facebook with your location attached, then if your post isn’t private (visible only to your friends), anyone with malicious intent, like burglars, can see it and will know they can take their time cleaning out your house while you are gone. Another major privacy lapse happened when Strava, a big exercise and activity tracking app, decided to celebrate how widely used its app was by posting an interactive heatmap of the entire world showing all the routes users had exercised along. It was a really fun and cool map that showed just how big the user base was. But unfortunately it also showed activity in parts of the world, such as the Middle East, where secret bases of operation were located, and it essentially blew those users’ cover. All because of a simple app that logged activity routes. Crazy, huh? Second, a lot of this can be avoided if you always keep in the back of your head that nothing is truly private and you are essentially always being tracked. You can either disable location services altogether on iOS or Android, or grant access on a per-app basis (new in iOS 13 and Android 10). The same can be done for apps on macOS and Windows inside their respective system settings.

    Just being aware of geotracking can be enough to change your habits as a user and make you more proactive. But for the most part it's inevitable, and we allow it for the convenience it brings to our lives. With ever-emerging technologies using any kind of location-based information, we must ask ourselves: how does this actually help us, and is the trade-off ultimately worth it?

Follow Daniel Burns on Twitter, @DBurnsOfficial

Comments

The State of Wearable Devices

10/9/2019

Comments

 
Written by Daniel Burns
With the improvement and miniaturization of technology, wearable tech is getting more advanced every year. Wearable devices fall into three main categories: smartwatches, health and fitness devices, and AR/VR (augmented reality and virtual reality). Just like smartphones and computers in general, these devices are getting smaller and smaller while gaining power and capability the world couldn’t have imagined five years ago.

    For smartwatches, the dream started back in the 1940s comic strips, with the two-way wrist radio worn by Dick Tracy himself. Today that dream is reality, with popular smartwatches constantly connected and used as communicators every single day. Smartwatches started out as companion devices to the smartphone: connected via Bluetooth, the smartwatch was a way to display notifications and alerts on your wrist, track steps, and later monitor heart rate. Most still do all of that very well today, as those are the bare minimum features of a “smartwatch.” There are a ton of smartwatches on the market today from Apple, Samsung, Fitbit, LG, and the now-defunct Pebble. But let’s look at the Apple Watch as the gold standard of smartwatches. After all, the Apple Watch now towers over any competition thanks to its massive and growing 80+% market share. It started out as a really stripped-down companion to the iPhone with all the basic features, an OLED display, a thick shell, and only single-day battery life. Now the current Series 5 is bigger, thinner, more powerful than the first four iPhones, has an always-on display, and connects via LTE and Wi-Fi on its own when the iPhone isn’t in range. The Apple Watch has become the next iPod, so to speak, with its small form factor and ability to stream and store millions of songs directly from your wrist with the help of AirPods or other Bluetooth earbuds. And since the introduction of the Series 4, Apple has added the ability to take an ECG anytime from your wrist, in addition to regular heart rate monitoring, which is crazy cool and was the first of its kind to be FDA cleared!

    Which leads perfectly into the health and fitness category of wearable devices. There are numerous players in this category, including Polar, Garmin, Nike, Basis, Withings, and now Apple. These devices don’t just monitor heart rate and activity; some can track blood pressure, sleep patterns, and other types of physical activity. Devices can be worn on the chest with a strap, or on the belly, arm, or wrist. Most people now prefer the wrist because of its convenience and lower restrictiveness during workouts. In the past, chest-worn devices were more accurate than wrist-worn ones, but each year better sensors, more data, and now AI have closed that gap, making wrist-worn the preferred choice. The Apple Watch and other devices have been credited numerous times with saving people’s lives thanks to early detection. For example, the Apple Watch will alert you to possible AFib by monitoring for irregular, high, and now low heart rates, meaning some conditions have effectively been flagged before a doctor could confirm symptoms, thanks to the incredible capabilities of the Apple Watch and wearable devices as a whole. You can even thank the device in emergency situations when your phone is out of reach, thanks to fall detection, automatic 911 calling, and OnStar-style emergency button convenience.

    Other types of wearable devices now include AR and VR goggles and glasses. With VR, or virtual reality, goggles, you can put them on and step into a whole other world dreamed up by game developers. VR headsets rest over your head and slide onto your face, covering your eyes and ears. Using two small but pixel-dense screens and lenses, VR goggles let you see something unlike the real world and move about the virtual space to play and explore to your heart’s content. AR, or augmented reality, instead overlays information on top of the real world. This can be done with glasses using small transparent screens or projections onto your eyes, or achieved on your smartphone using its camera as you move about the room. In both cases you can use your hands and controllers to play with and position items. It really is a unique experience that is best explained by actually trying it in person. VR and AR are powered either internally, in the goggles or glasses like a smartphone, or externally through a one-cable connection to a gaming computer or any computer with a graphics card capable of powering the experience. The most successful VR devices on the market are from Oculus and HTC with the Rift and the Vive, while AR glasses have been made by Google with Google Glass and Microsoft with its HoloLens.

    Wearable technology has come a long way thanks to the evolution of technology and Moore’s law. Miniaturization of components and complete systems has enabled devices like smartwatches and VR glasses to become more powerful and more capable. I can’t wait for the day we wear AI-powered communicators and step into full-room virtual worlds just like Star Trek and The Orville alike have imagined. Dream on, and virtual reality might just become our reality.

Follow Daniel Burns on Twitter, @DBurnsOfficial

Comments

What should be considered when selecting a monitor?

9/17/2019

Comments

 
Written by Daniel Burns
When it comes time to buy a new monitor, whether to go along with a new computer, to add a second display, or to replace an outdated one in your current setup, the decision may seem like a daunting task. Not to worry; let's break down the three main types of displays and their varying technologies to find the best use case for your next monitor. We will specifically cover flatscreen monitors; after all, it is 2019 and no one is buying a CRT (cathode ray tube, the big tube-type monitor) brand new anymore.

    Let’s start with size, one of the most important aspects of selecting a monitor. Get it? Aspects? Anyway, if your current monitor has a small screen of, let's say, 17” to 22” measured diagonally, and you think it's on the small side, then by all means upgrade to your heart's content. Of course, there are limits to size depending on how much room you have on your desk, what the monitor is being used for, how far away you sit from the screen, and of course your budget. Monitors, compared to TVs, follow a different general rule for how close you should sit in front of them. A quick Google search will turn up distance guides for the size you want, and you will even find calculators that take custom distances and sizes beyond the basic charts. For example, for a 24” to 29” monitor, the general rule is to sit about an arm's length away from the screen, or about 4 feet, for a healthy dosage of screen time. You will hear “results vary depending on use case” a lot in this piece, because they really do.
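As a sketch of what those online calculators do, here's a toy version. The linear scaling rule is my own assumption, anchored to the rough 4-foot figure for a 24" to 29" monitor mentioned above; real calculators use fancier math:

```python
# Toy viewing-distance calculator. The linear scaling is an assumption,
# anchored to the rough "about 4 feet for a 24-29 inch monitor" rule above.
def viewing_distance_ft(diagonal_inches: float) -> float:
    reference_size = 26.5  # midpoint of the 24-29 inch range
    reference_dist = 4.0   # feet, per the rule of thumb
    return round(diagonal_inches * reference_dist / reference_size, 1)

print(viewing_distance_ft(24))  # smaller screen, sit a bit closer
print(viewing_distance_ft(32))  # bigger screen, push it back a little
```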

    Once you have your monitor size down, you will need to establish a budget and a main use case. Generally, the smaller the screen, the cheaper it gets, and the bigger, the more expensive, not counting additional features and creature comforts. If you do light everyday email, web surfing, and word processing, then a cheaper and smaller screen with a 1080p resolution at about $80 to $200 will fit you fine. If you do all that plus some light gaming, then a bigger screen at 1080p or 1440p costing $200 to $300 is your ticket. But let's say you do some intensive gaming and edit photos and videos on the side; then expect a budget of $400 to $1,000 for a 4K resolution. At the very top of the market, if you want it all, you’re an extreme gamer, or you edit large videos and photos, then you already know you will be spending $1,000+ on a single monitor with at least a 4K (2160p), or these days even a 5K or 6K, resolution. So match your budget to screen size and use case, and screen technology will mostly follow suit.

    Now let's get technical and dig into the three main flatscreen technologies you may come across. First, you might find the cheaper and often smaller TN panel, which stands for twisted nematic. This is a type of LCD (liquid crystal display) LED display that sacrifices color accuracy and viewing angles for performance and budget. A TN panel has some of the fastest response times and highest refresh rates of any display; for gaming, video editing, and just plain pleasure, fast response times and refresh rates are top priority. But a TN panel also has some of the worst viewing angles on the market, meaning that when you view the screen at any angle other than straight on, colors, brightness, and clarity are greatly reduced. Next we have the middle child of display technology, the one that usually rides the line between price and performance: the VA, or vertical alignment, panel. It usually has the longest and slowest response times but really high refresh rates. You might not find a VA panel as often as the other two because it's generally the most expensive. It's best suited for high-resolution, slow-motion playback during sporting events, thanks to those high refresh rates. Colors and viewing angles on a VA panel are usually better than on a TN panel thanks to its high contrast and image depth. But the mack daddy of monitors, the IPS panel, squashes them both. The IPS display, short for in-plane switching, is what you will generally find these days: your TV is probably IPS, many phone screens are IPS, and most laptops are now IPS too. IPS panels give you the best color accuracy, contrast, brightness, and viewing angles of any monitor. You can look at an IPS monitor from practically any angle, even far off to the side, and not notice any degradation in quality or brightness. But they are currently the most expensive on the market, for very good reason. Their refresh rates, on the other hand, lag behind the others, though every year they get better, and we are reaching the point where most IPS panels are good enough to land in cheaper devices and monitors.
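Here are the TN/VA/IPS trade-offs condensed into a quick lookup. The ratings are qualitative summaries of this article, not measured specs:

```python
# Panel trade-offs from the article, as a quick-reference table.
# Ratings are qualitative summaries, not measured specifications.
PANEL_TYPES = {
    "TN":  {"response_time": "fastest",  "viewing_angles": "worst",
            "color_accuracy": "lowest",  "relative_price": "$"},
    "VA":  {"response_time": "slowest",  "viewing_angles": "good",
            "color_accuracy": "good",    "relative_price": "$$"},
    "IPS": {"response_time": "moderate", "viewing_angles": "best",
            "color_accuracy": "best",    "relative_price": "$$$"},
}

def panel_for(priority: str) -> str:
    """Pick a panel type by what matters most to you."""
    picks = {"speed": "TN", "contrast": "VA", "color": "IPS"}
    return picks[priority]

print(panel_for("color"))  # -> IPS
```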

    So let’s wrap this up. It all really comes down to your use case. For the average user and everyday Joe Schmo in 2019, a 1080p full HD IPS monitor with a screen size of 21" to 24" that costs around $160 would be great for the masses. If you’re a gamer or video editor, then a high refresh rate 4K IPS monitor for $600 at 27" to 32" is your best bet. You will always want to consider longevity and being “future proof” as a factor for spending more as well. And if you have no budget, well, go crazy and treat yourself. Once again, it comes down to use case and budget; the rest usually falls into place.

​Follow Daniel Burns on Twitter, @DBurnsOfficial


Comments

Best Ethernet Wiring Options

8/22/2019

Comments

 
Written By Daniel Burns
Today there are many different standards for data transmission, and one standard has always been reliable and is still in use today: Ethernet. It has been around since the 1980s and was standardized in 1983 as IEEE (Institute of Electrical and Electronics Engineers) 802.3. Ethernet is a standardized way of transmitting 1s and 0s, typically measured in megabits per second, and is used for all types of data transmission.
There are more than seven categories of Ethernet wiring today, and as we require faster speeds and greater data throughput, there will always be a need for faster standards. Let's start with the basics. An Ethernet cable is typically made up of thin twisted pairs of insulated copper. The pairs are twisted together to reduce something called crosstalk: when data-carrying wires run close to or next to each other, they interfere with one another, typically resulting in slower speeds or connection dropout. Twisting the wires together prevents them from interfering with each other. To further fight interference, Ethernet cables can add extra shielding around each twisted pair, plastic separators for spacing and rigidity, and an aluminum foil-like wrap just beneath the cable's outer jacket. The tighter the twist, the better the signal travels.
    Ethernet cables are rated in categories, and they go as follows. Category 1 consists of two twisted wire pairs (four wires total) and is the oldest of the bunch. It is no longer used today and was only rated for voice; you would typically find it inside older buildings, used only for phone lines. Each category name can be shortened to “Cat” followed by the version number, i.e., Cat1, Cat2, and so on. Next we have Cat2. It consists of four twisted wire pairs (eight wires) and handles up to 4Mbps with a maximum frequency of 10MHz. It is also no longer used today. Cat3 is made the same way but adds an extra three twists per foot and can handle up to 10Mbps; it is only used for telecommunication (phone) equipment today. Cat4 is the same but with more twists, is rated for 20MHz, and is obsolete. Are you seeing the pattern yet? Cat5 cable is the same except each pair is now twisted separately and then twisted with the other pairs. It is rated for 100MHz and is also known as 100BaseTX. Next up we have the most commonly used cable today, Cat5e. The “e” stands for enhanced, and it is exactly that: an enhanced version of Cat5. The big difference is that Cat5e can transmit on all four pairs of wire at the same time, allowing it to handle gigabit speeds of 1000Mbps! It is also referred to as 1000BaseT. Note that no category below 5e should be used in today’s networking applications unless you absolutely have to.
    My favorite version, Cat6, is the next best thing for price per performance, or should I say price per foot. Cat6 (1000BaseTX) is rated for 250MHz and became a standard in 2002 as a riser cable to connect floors together. Today it should be used in all new applications and retrofits instead of Cat5e. Cat6a (10GBaseT) is a jump similar to the one from Cat5 to Cat5e, in that the performance increase warranted its own category. The “a” stands for augmented; it can handle up to 500MHz with major improvements to crosstalk elimination, can be run up to 100 meters (328 ft), and has a power transmission rating of 3dB. Cat6a will be the next big cable for almost any application. There is a Cat7, and it allows an astonishing 10 gigabits per second over 100 meters of copper, but as of 2017 it still isn’t recognized as a standard. (Cat6 is also capable of 10-gigabit transfers over short local runs and is often used with 10-gig network cards in video editing applications.) There are even Cat8 cables that are extremely fast, but we don’t need to cover them here, especially since the standard isn’t fully developed yet.
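For quick reference, here are the ratings above condensed into a small lookup. These are nominal figures; real-world throughput also depends on run length, shielding, and installation quality:

```python
# Nominal ratings for the still-relevant categories described above.
# Real-world throughput depends on run length, shielding, and install quality.
ETHERNET_CATEGORIES = {
    "Cat3":  {"bandwidth_mhz": 16,  "speed_mbps": 10},
    "Cat5":  {"bandwidth_mhz": 100, "speed_mbps": 100},
    "Cat5e": {"bandwidth_mhz": 100, "speed_mbps": 1_000},
    "Cat6":  {"bandwidth_mhz": 250, "speed_mbps": 1_000},
    "Cat6a": {"bandwidth_mhz": 500, "speed_mbps": 10_000},
}

def category_for(speed_mbps: int) -> str:
    """Lowest category in the table rated for at least the requested speed."""
    for name, spec in ETHERNET_CATEGORIES.items():  # dicts keep insertion order
        if spec["speed_mbps"] >= speed_mbps:
            return name
    raise ValueError("no listed category is rated for that speed")

print(category_for(1_000))   # -> Cat5e, the cheapest gigabit-rated option
print(category_for(10_000))  # -> Cat6a
```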
    So the best type of Ethernet cable for most people is Cat6, thanks to its cost-effective performance; it's best suited for 90% of applications. Of course, if you need more speed and you have the cash, then Cat7 sounds pretty neat. And when you are shopping for cables, remember not to get too wired up about it.

Follow Daniel Burns on Twitter, @DBurnsOfficial

Comments

Compared: Hyper-V vs VMware

7/17/2019

Comments

 
Written By Daniel Burns
    Virtualization has been around since the 1960s as a method of logically dividing system resources provided by the mainframe computers between different applications (Virtualization - Wiki). Today we use it to run applications, operating systems, and even virtual hardware within another system. Hyper-V is virtualization software developed by Microsoft to run on Windows and Windows Server platforms. It can not only virtualize operating systems but also entire hardware components, such as hard drives and network switches. Unlike Fusion and VirtualBox, Hyper-V is not limited to the user’s device. You can use it for server virtualization, too. (Iva 2018).

    Hyper-V is a Windows Server add-on that allows you to run multiple virtual servers on one dedicated server. It is also available for Windows 10 as a way to run almost any operating system on top of Windows. To enable Hyper-V on your Windows device, you’ll need to be running a 64-bit version of your operating system with a minimum of 4GB of RAM (Iva 2018). Although Microsoft recommends a minimum of 4GB of RAM, it is better to have 8GB or more, because a minimum is exactly that. Any virtual environment needs resources such as CPU cores, RAM, and storage space, and the more you can throw at it without hindering your base operating system, the better your virtual environment will run. So why might you use a virtual machine? You might have an application that simply can’t run on your current operating system because it is too old or incompatible. Instead of partitioning your drive to install a second operating system and booting into it, virtualization lets you run that second OS inside a window on your current OS. Pretty neat, huh? For example, I have an older version of a CAD program that no longer exists, and it can only run on Windows XP. I have specific files for that program that need manipulating or converting, but I can’t run it on my Windows 10 computer. So using Hyper-V, or a similar program like VMware Fusion or Oracle’s VirtualBox, I can load up Windows XP and get my work done, all without leaving my desktop or messing with its boot configuration. You can run anything from Windows to Linux, and even macOS on a PC with some custom boot loader configurations.

    Hyper-V comes preinstalled with Windows 10, so you don’t have to pay for or download anything extra. To enable it, go to Control Panel, click “Uninstall a program,” then select “Turn Windows features on or off” on the left side. Scroll down to Hyper-V, check the box to enable it, and select OK. You can also enable Hyper-V from an elevated Command Prompt using this command: “DISM /Online /Enable-Feature /All /FeatureName:Microsoft-Hyper-V.” After it’s turned on, you can launch it from search and set up your virtual environment according to your needs: number of CPU cores, RAM allotment, storage size and location, plus many more advanced settings to configure, such as sound, networking, and other use-case-specific options. And you aren’t limited to running older software; virtualization and Hyper-V cover all sorts of task-specific needs, whether you’re experimenting with other operating systems or with unreleased, work-in-progress programs. Hyper-V makes it very easy to create and remove different operating systems to your heart’s content (Microsoft). Want to test software on multiple operating systems using multiple virtual machines? With Hyper-V, you can run them all on a single desktop or laptop computer. These virtual machines can then be exported and imported into any other Hyper-V system, including Azure (Microsoft).
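The same enable step, and the export/import workflow mentioned above, can also be done from an elevated PowerShell prompt using Hyper-V’s documented cmdlets. The VM name and backup paths below are placeholders for illustration.

```shell
# Enable the Hyper-V feature from PowerShell (a restart is required afterward):
Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Hyper-V -All

# Later, export a VM so it can be moved to another Hyper-V host:
Export-VM -Name "WinXP" -Path "D:\VM-Backups"

# On the destination machine, import it by pointing at the exported
# .vmcx configuration file (the GUID in the filename varies per VM):
Import-VM -Path "D:\VM-Backups\WinXP\Virtual Machines\<vm-guid>.vmcx"
```

Export-VM copies the VM’s configuration, checkpoints, and virtual disks into one folder, which is what makes moving a machine between Hyper-V hosts a simple copy-and-import job.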

    Now, Microsoft’s Hyper-V isn’t the only virtual machine software on the market today. It has major competition from VMware’s paid products, VMware Fusion and vSphere, and some from Oracle’s free program, VirtualBox. Let’s focus on VMware, since the two compete most directly. Virtualization remains one of the hottest trends in business IT (Collins 2019). Beyond its major disadvantage of being a paid program, VMware does offer a few advantages over Hyper-V. VMware has been in the server virtualization market far longer than Microsoft: the company has been in business since 1998, shipped its first product (VMware® Workstation) the following year, and released its first true server product (ESX® Server) in 2002 (Posey 2017). Hyper-V only arrived in 2008, when Microsoft announced it alongside Windows Server 2008, and it only made its way to consumers in 2012 with the release of Windows 8. So VMware has more maturity and experience behind it. Both offer nearly identical feature sets and their own learning curves, so it’s up to you or your company to decide which is best suited to your needs (Posey 2017). VMware is a bit more consumer friendly with its user interface, and some novice users may find it easier to work with. Both should be around for a long time, so long-term support shouldn’t be a top concern when weighing one against the other. VMware’s core hypervisor is less expensive than Microsoft’s; however, Microsoft’s management server costs less than VMware’s (Posey 2017).

    So when everything is considered, it really comes down to what is best suited to your own needs. As a consumer, you can’t go wrong with Hyper-V, included with Windows and readily available. Virtualization software has come a long way, and the dream of having one physical computer run multiple machines under one roof keeps getting easier, faster, and cheaper to realize. Next up, cloud virtual computing on every connected device!… someday.

Follow Daniel Burns on Twitter, @DBurnsOfficial

References:
Docter, Q. (2018). CompTIA Security+: study guide. Hoboken, NJ: Sybex.

Hyper-V. (2020, February 28). Retrieved from https://en.wikipedia.org/wiki/Hyper-V

Scooley. (n.d.). Introduction to Hyper-V on Windows 10. Retrieved from https://docs.microsoft.com/en-us/virtualization/hyper-v-on-windows/about/

Scooley. (n.d.). Enable Hyper-V on Windows 10. Retrieved from https://docs.microsoft.com/en-us/virtualization/hyper-v-on-windows/quick-start/enable-hyper-v

Iva, Steve, Z. I. T. (2018, October 10). What Is Hyper-V & How Do You Use It? A Beginner's Guide. Retrieved from https://www.cloudwards.net/hyper-v/

Shinder, D. (2017, June 23). 10 things you should know about Hyper-V. Retrieved from https://www.techrepublic.com/blog/10-things/10-things-you-should-know-about-hyper-v/

Collins, T. (2019, December 19). Hyper-V vs. VMware: Which Is Best? Retrieved from https://www.atlantech.net/blog/hyper-v-vs.-vmware-which-is-best

Posey, B. (2019, October 7). Virtual Infrastructures: Hyper-V vs VMware. Retrieved from https://www.solarwindsmsp.com/blog/virtual-infrastructures-hyper-v-vs-vmware
Comments

One week with the Apple Watch

5/24/2015

Comments

 
“Watch Out Android Wear!”

One Week with the Apple Watch.

After having the Apple Watch for over a week now, I have found it to be a very enjoyable experience. From glancing at notifications in the classroom to logging my activity in PE or out on a run, it’s actually been a very useful tool. This new product comes from the post-Steve Jobs era, with new CEO Tim Cook at the helm, heading in the right direction for Apple’s future. The interface is something new to iOS users. There could be a slight learning curve at first, but Apple has made navigating the new interface easier by keeping things fluid and familiar to its bigger brother operating system, iOS. It just takes getting used to, and after less than a day with it, I got the hang of it.

The heart rate tracking is actually quite fun. It’s pretty cool to see your heart rate throughout the week, and with Apple’s new HealthKit, communication with your doctor is easier than ever. The new ability to submit or show your daily and weekly blood pressure and heart rate readings to your doctor is amazing. No more high blood pressure spikes at the doctor’s office; now you have a more accurate outlook on your health. Sleep tracking is kind of neat, too. It’s interesting to see your lowest heart rate while sleeping and an accurate record of how long you slept.

The battery is as expected for a first-generation device; it just depends on what you do with it in a day. If you do a lot, including workouts, notifications, and checking the time, you get the expected 18+ hours of use. I’ve gotten it to last about 2 days, which is great compared to the 18 hours Apple claims. Charging the watch is very cool: you just set it on its small, puck-sized charger and it charges wirelessly via induction. It takes only 2 hours to get a full charge with the 5W adapter that comes with it, and only 1 hour with Apple’s 12W adapter (sold separately).

The Watch case and materials are surprisingly durable. It feels unlike anything Apple has made before, and that’s not a bad thing. It feels very premium on the wrist, which nails the whole point Jony Ive made when designing this watch. I have the 42 mm space gray Apple Watch Sport, which has the aluminum body and Ion-X glass. For example, in wood shop at school, I accidentally hit the center of the screen on a metal machine and it didn’t even scratch. I was astonished at its durability. The higher-end models have the sapphire crystal screen, which is expected to be even tougher. So no worries here.

There are already tons of apps for the watch in the App Store, surpassing Android Wear’s catalog in just one day of availability, so you’ll have no problem finding an app. Android needs to step up its game if it wants to compete. All in all, this is a great new product with the potential to live on. The Apple Watch is a game changer that will set new standards for other companies to follow. Whether you get the current Apple Watch, which is still on backorder after 4 million+ orders, or wait for the second generation, I assure you that you won’t be sorry with your purchase. I highly recommend the Apple Watch.

Follow Daniel Burns on Twitter, @DBurnsOfficial

Comments

    Meet the Writer

    Daniel Burns is the co-owner of Adium Technologies and has been in the IT business since 2014. He is currently pursuing a master’s degree in Cybersecurity Management at San Diego State University. He occasionally shares his rants on technology and strives to make the use of technology easier for the everyday user. You can follow him on Twitter for his latest likes, rants, and opinions.

"Pronounced: Add-ie-um"
©2022 ADIUM TECHNOLOGIES. ALL RIGHTS RESERVED.
Daniel Burns and Jacob Byerline
CA Business License #026860
Insured by Progressive Insurance