
Containers Expo Blog: Article

8 Software Testing Demons that Service Virtualization Can Exorcise

Learn How Service Virtualization Can Help You Exorcise Your Scariest Software Testing Demons

Ever had a frightful encounter with the following testing demons? They tend to lurk around complex interconnected systems, just waiting to wreak havoc by forcing you to delay testing, work at dreadful hours, or make distressing trade-offs on the completeness of your testing...

Evolving/incomplete systems
You're ready to test the part of the system you're responsible for, but you can't really exercise it unless you interact with other system parts that are still evolving, or not yet even implemented.
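One common way to sidestep this demon is to stand in for the unfinished dependency with a virtual version that returns canned responses. The sketch below is purely illustrative (the service and SKU names are made up, not from any real product), but it shows the core idea: the code under test depends only on the dependency's interface, so a stub can take the real system's place until it exists.

```python
# Hypothetical sketch: a canned stand-in for a dependency that is
# not yet implemented. All names here are illustrative.

class InventoryServiceStub:
    """Returns canned responses in place of the real, unfinished service."""

    def __init__(self, canned=None):
        # Map SKUs to stock counts; defaults are made-up test data.
        self.canned = canned or {"sku-123": 7, "sku-456": 0}

    def stock_level(self, sku):
        # Unknown SKUs behave like an empty warehouse.
        return self.canned.get(sku, 0)

def can_fulfill(sku, quantity, inventory):
    # The application under test only needs the interface,
    # not the real downstream system.
    return inventory.stock_level(sku) >= quantity

stub = InventoryServiceStub()
print(can_fulfill("sku-123", 3, stub))  # True: 7 in stock
print(can_fulfill("sku-456", 1, stub))  # False: out of stock
```

Once the real service ships, the stub is swapped out without touching the logic under test.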

Inaccessible systems
This includes those dependent systems available for testing only from 3am to 5am on Saturday mornings. They're a close cousin of evolving/incomplete systems: the reason you can't access them is different (security restrictions, "geopolitical" boundaries, etc.), but the impact on your testing is the same.

Unrealistic performance
Staging environments commonly lack the computing resources required to deliver realistic performance from downstream systems, or to emulate complex network factors such as bandwidth, latency, and jitter. Testing against unrealistic conditions leads to nasty surprises later: when the environment doesn't accurately represent real-world conditions, performance testing can produce false assurances.
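A virtual service can inject realistic latency and jitter even when the staging hardware can't. The following minimal sketch (parameter names and defaults are assumptions for illustration) shows the basic mechanism: each simulated call sleeps for a base latency plus a random jitter component before returning.

```python
import random
import time

def virtual_call(base_latency_ms=200, jitter_ms=50):
    """Simulate a downstream call with configurable latency and jitter."""
    # Random jitter makes response times vary around the base latency,
    # mimicking real network behavior.
    delay_ms = base_latency_ms + random.uniform(-jitter_ms, jitter_ms)
    time.sleep(max(delay_ms, 0) / 1000.0)
    return {"status": "ok"}

start = time.monotonic()
response = virtual_call()
elapsed_ms = (time.monotonic() - start) * 1000
print(response["status"], round(elapsed_ms), "ms")
```

Dedicated service virtualization tools apply the same idea at the network level, so the application under test experiences production-like timing without any code changes.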

Provisioning delays
Sure, you can get the test environment provisioned with all the dependencies you need to exercise...you'll just have to wait a few weeks for it to be configured to your liking and stood up. By then, your team will likely be on the next iteration.

Test conditions that are difficult to achieve
To achieve the expected level of test coverage, you often need to see how dependencies configured for various edge, error, or failure conditions impact the application under test (AUT). But good luck getting these difficult-to-produce conditions configured, especially if you have limited access to (or control over) the dependency.
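With a virtual dependency, those hard-to-reproduce conditions become a one-line configuration switch. This sketch is hypothetical (the credit-check service and its modes are invented for illustration), but it captures the pattern: each test case dials the virtual dependency to the exact condition it needs, including failures the real system almost never produces on demand.

```python
class VirtualCreditCheck:
    """A stand-in dependency whose behavior is set per test case."""

    def __init__(self, mode="approve"):
        # "approve", "decline", or "error" -- chosen by the test.
        self.mode = mode

    def check(self, customer_id):
        if self.mode == "error":
            # A failure condition that is trivial to trigger here,
            # but nearly impossible to produce in the real system.
            raise TimeoutError("upstream credit bureau timed out")
        return {"customer": customer_id, "approved": self.mode == "approve"}

def place_order(customer_id, credit_service):
    # AUT logic under test: degrade gracefully when the dependency fails.
    try:
        result = credit_service.check(customer_id)
    except TimeoutError:
        return "queued-for-manual-review"
    return "accepted" if result["approved"] else "rejected"

print(place_order("c-1", VirtualCreditCheck("approve")))  # accepted
print(place_order("c-1", VirtualCreditCheck("decline")))  # rejected
print(place_order("c-1", VirtualCreditCheck("error")))    # queued-for-manual-review
```

The error path, which might otherwise go untested until it fails in production, now runs in every build.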

Third-party access fees
Pay-per-use fees for cloud-based or shared services such as payment card processing, credit checks, etc. might be an expected expense for production usage. However, these costs can escalate quickly for continuous testing or high-volume performance testing.

Other teams working on shared test environments
It can take hours, or sometimes even days, to get a test environment configured exactly how you like it; then another team comes in and reconfigures it to suit their needs. You can't blame them, but it's frustrating nevertheless.

Mainframe access
Developing and testing applications that leverage a mainframe environment is commonly a complex, costly, and time-consuming endeavor. Factors such as the complexity of access, the cost of MIPS consumption, and the operational cost and delays involved in making changes to mainframe components make mainframe-related testing extremely frightening to testers and mainframe experts alike.

How Service Virtualization Helps Exorcise These Software Testing Demons
By exorcising (or rather, virtualizing) these demons through the power of service virtualization, you can test earlier, faster, and more completely.

Service virtualization gives developers and testers the freedom to exercise their applications against incomplete, constantly evolving, or difficult-to-access environments. It provides flexible 24/7 access to the dependent application behavior you need to complete your development and testing tasks. Teams taking advantage of service virtualization are able to:

  • Start testing whenever they're ready.
  • Rapidly configure the environment conditions critical to their test plan.
  • Complete the desired breadth and volume of tests.
  • Confidently promote the application under test to the next level.
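In practice, service virtualization tools work at the wire level: the application under test talks to a virtual service over the network, exactly as it would talk to the real dependency. The standard-library sketch below (the endpoint and response payload are invented for illustration) shows the shape of that idea, with a canned HTTP service answering in place of a real one.

```python
# Minimal sketch of a wire-level virtual service using only the
# Python standard library. Real service virtualization tools record
# and replay dependency traffic; the endpoint and payload here are
# made up for illustration.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class VirtualDependency(BaseHTTPRequestHandler):
    def do_GET(self):
        # Canned response the AUT sees instead of the real system.
        body = json.dumps({"balance": 42.50}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        # Keep test output quiet.
        pass

# Port 0 lets the OS pick a free port.
server = HTTPServer(("127.0.0.1", 0), VirtualDependency)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/account"
with urllib.request.urlopen(url) as resp:
    data = json.loads(resp.read())
print(data["balance"])
server.shutdown()
```

Because the virtual service speaks the same protocol as the real one, the application under test needs no code changes to use it.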


Hungry for more? Go trick-or-treating in Parasoft's Service Virtualization Resource Center.

More Stories By Cynthia Dunlop

Cynthia Dunlop, Lead Content Strategist/Writer at Tricentis, writes about software testing and the SDLC, specializing in continuous testing, functional/API testing, DevOps, Agile, and service virtualization. She has written articles for publications including SD Times, Stickyminds, InfoQ, ComputerWorld, IEEE Computer, and Dr. Dobb's Journal. She has also co-authored and ghostwritten several books on software development and testing for Wiley and Wiley-IEEE Press. Dunlop holds a BA from UCLA and an MA from Washington State University.
