Unlock the Power of Hybrid Cloud in Just 30 Minutes

As a seasoned IT professional, you know that staying ahead of the technology curve is essential to your organization’s success. Whether you’re looking to modernize your data center, improve employee productivity, or enhance your customers’ experience, the right tools and expertise can make all the difference. That’s why we’re excited to introduce our latest offering: the VMware Cloud on AWS 60-Day Hands-on Lab.

This comprehensive lab is designed to give you the hands-on experience you need to master VMware Cloud on AWS, one of the most powerful and flexible cloud platforms available today. With 60 days of access to our state-of-the-art lab environment, you’ll have ample time to explore all aspects of this cutting-edge technology and develop the skills you need to succeed in your organization.

VMware Cloud on AWS is a game-changer for IT professionals like you. It offers a seamless integration of VMware’s industry-leading virtualization technologies with the scalability and flexibility of Amazon Web Services (AWS). This means you can easily migrate your existing VMware workloads to the cloud, or build new applications from scratch using the same tools and frameworks you’re familiar with.

Our 60-Day Hands-on Lab is designed to help you master every aspect of VMware Cloud on AWS. It covers a wide range of topics, including:

* Installation and configuration of VMware Cloud on AWS

* Migrating existing workloads to the cloud

* Building new applications using familiar tools and frameworks

* Leveraging the power of AWS services such as storage, networking, and security

* Optimizing performance and cost-effectiveness in your cloud environment

Our lab is designed to be highly interactive, with a variety of hands-on exercises and projects that allow you to apply what you’ve learned in a real-world setting. You’ll have access to a range of virtual machines and tools, including VMware vSphere, vCenter Server, and AWS CloudFormation.

In addition to the comprehensive lab environment, we’re also offering a range of resources and support to help you succeed. These include:

* A dedicated community forum where you can ask questions, share knowledge, and collaborate with other participants

* Regular webinars and office hours with VMware and AWS experts

* Access to a range of learning resources, including video tutorials, documentation, and case studies

At XMSOFT, we’re committed to helping you succeed in your IT career. That’s why we’re offering this 60-Day Hands-on Lab at no cost to you. With our comprehensive lab environment, extensive resources and support, and guidance from VMware and AWS experts, we’re confident that you’ll gain the knowledge and skills you need to master VMware Cloud on AWS and take your organization to the next level.

So why wait? Register for our 60-Day Hands-on Lab today and start exploring the power of VMware Cloud on AWS. With this comprehensive lab environment and expert guidance, you’ll be well on your way to mastering one of the most exciting and powerful cloud platforms available today.

Mastering Network Commands in VB.NET

In Visual Basic .NET (2005), there are several useful network commands that can be used to perform tasks such as pinging and performing DNS lookups, all from within the comfort of your own code. These commands are the equivalents of doing `ping` and `nslookup` from the command prompt, but with the added convenience of being able to call them directly from your code.

One of the most useful networking tools in Visual Basic .NET is the `Ping` class, found in the `System.Net.NetworkInformation` namespace. Its `Send` method takes a single string argument: the host name or IP address of the host you want to ping. Here’s an example of how you can use the `Ping` class in your code:

```
Imports System.Net.NetworkInformation

Dim sIPAddress As String = "127.0.0.1" ' Replace with the host name or IP address you want to ping

' Create a Ping object and send a ping request
Dim ping As New Ping()
Dim reply As PingReply = ping.Send(sIPAddress)

' Check if the host is reachable
If reply.Status = IPStatus.Success Then
    ' Host is reachable, do something here (reply.RoundtripTime holds the latency)
Else
    ' Host is not reachable, do something else here
End If
```

This code creates a new `Ping` object and uses it to send a ping request to the specified address. The result is returned as a `PingReply` object, whose `Status` property tells you whether the host is reachable.

For DNS lookups, the equivalent of `nslookup` is the `Dns` class in the `System.Net` namespace. Its `GetHostEntry` method takes a single string argument: the domain name or IP address you want to look up. Note that, unlike `nslookup`, the `Dns` class always uses the DNS servers configured on the machine, so querying a specific server (such as 8.8.8.8) is not possible without a third-party library. Here’s an example of how you can use the `Dns` class in your code:

```
Imports System.Net
Imports System.Net.Sockets

Dim sDomain As String = "example.com" ' Replace with the domain name or IP address you want to look up

Try
    ' Perform the lookup using the DNS servers configured on this machine
    Dim entry As IPHostEntry = Dns.GetHostEntry(sDomain)

    ' Record exists, list the resolved IP addresses
    For Each address As IPAddress In entry.AddressList
        Console.WriteLine(address.ToString())
    Next
Catch ex As SocketException
    ' Lookup failed, do something else here
End Try
```

This code uses the `Dns` class to perform a DNS lookup on the specified domain name or IP address. A successful lookup returns an `IPHostEntry` object containing the resolved addresses; a failed lookup throws a `SocketException`, which is why the call is wrapped in a `Try...Catch` block.

In conclusion, Visual Basic .NET gives you convenient in-code equivalents of `ping` and `nslookup`. With the `Ping` and `Dns` classes you can check host reachability and resolve names directly from your applications, without shelling out to the command prompt, making it easy to build common network tasks into your code.

More and more countries banning smartphones in schools

The use of smartphones in schools has been a controversial topic for some time now. While some argue that they can be a valuable tool for learning, others claim that they are a distraction and hinder the academic performance of students. In recent years, several countries have implemented bans on smartphone use in schools, and the trend is gaining momentum.

In France, a national ban on smartphone use in schools was introduced in 2010 and has been enforced in various forms since then. The latest measure, taken in 2018, prohibits the use of smartphones during school hours, including in classrooms, corridors, and playgrounds. The ban applies to all students aged three to 15, and teachers are allowed to confiscate phones that are used during lessons.

Italy also has a long-standing smartphone ban in schools, which was introduced in 2007. However, the current government has relaxed the rules, allowing students to use tablets and laptops under the supervision of teachers.

In the United Kingdom, the government has issued guidelines recommending that schools prohibit smartphone use during lessons and breaks. The guidance suggests that schools can confiscate phones used during lessons, but this is not a legal requirement. According to a recent survey, 80% of British schools have already implemented such a ban.

Denmark, known for its innovative approach to education, has also joined the list of countries banning smartphones in schools. The Danish Ministry of Education advises schools to keep smartphones out of the classroom, but it is up to each school to decide how to enforce the policy. Some schools have introduced “phone hotels” where students can deposit their phones before lessons begin.

Portugal is experimenting with a similar approach, allowing schools to choose whether or not to ban smartphones. Some schools have implemented fixed days each month when students are not allowed to bring their phones to school.

Spain has also seen some regions impose a complete ban on smartphone use in schools, but the policy is not uniform throughout the country.

The debate around smartphone use in schools is not only about the distractions caused by these devices but also about the impact of screen time on children’s mental and physical health, as well as the potential for cyberbullying and online safety concerns. While some argue that a complete ban on smartphones may be extreme, others believe it is necessary to ensure that students can focus on their studies without distractions.

As technology continues to evolve and play an increasingly significant role in our daily lives, it will be interesting to see how schools and governments around the world navigate this complex issue in the future.

Virtual Admins

The Curious Case of Virtualization and Licensing

In today’s digital age, virtualization has become an essential aspect of IT infrastructure. With its numerous benefits, such as cost savings, increased efficiency, and improved scalability, it’s no wonder that more and more businesses are turning to virtualization to power their operations. However, a recent experience I had with a specialized application highlighted the potential pitfalls of relying solely on virtualization, particularly when it comes to licensing and the restrictions placed by software developers.

The story begins a couple of years ago, when we were tasked with testing a new application that required a SQL Server Express instance, a proxy/licensing server, and a client installation with license files. Given its simplicity and modest resource requirements, we thought it was a prime candidate for running in a virtualized environment. As such, we offered to be a pilot customer and test it live in our environment. The testing confirmed what we thought: it worked perfectly when virtualized! We were happy, and so were the developers.

However, things took an unexpected turn when we received the final version of the software after the testing phase. When I went to install the proxy/licensing service, I discovered that the application developers had put checks in place to prevent installation on virtual machines!

Their reasoning was that an easily duplicated proxy/licensing service running in a virtual environment could be used to bypass their concurrent user license model. In blocking that scenario, however, they also blocked us from implementing the solution as we wished. We are still stuck with a physical server running this service, even though it would be more efficient to run it virtually.

Now, I understand that software developers need to protect their intellectual property and ensure that their licensing models are enforced. However, in this case, the developers’ actions have caused us more harm than good. By blocking virtualization, they have made it more difficult for us to scale our environment efficiently and cost-effectively. It’s like they are trying to hold us hostage with their outdated licensing models.

I cannot help but wonder if the developers of this application were pirates in a previous life. Perhaps they were part of a secret society of virtualization-haters who sought to prevent the widespread adoption of virtualization technology. Whatever their reasons, their actions have made it clear that they are not on our side.

As someone who has embraced virtualization and its many benefits, I find it frustrating when software developers make it difficult for us to use their products in a virtualized environment. It’s like they are fighting against progress and innovation. In my opinion, virtualization is the future of IT infrastructure, and software developers need to adapt to this reality.

In conclusion, the story of our experience with this specialized application highlights the potential challenges that can arise when implementing a “virtualize first” strategy. While virtualization offers numerous benefits, it also introduces new challenges, such as licensing restrictions and the need for developers to adapt to new environments. As we move forward in this digital age, it’s essential that software developers understand the needs of their customers and work towards providing solutions that are flexible, scalable, and easy to implement.

So, was virtualization indeed created by pirates? Perhaps Dilbert said it best – “Virtualization was invented by pirates to make it easier to steal software.” But in all seriousness, I would rather be a ninja than a pirate any day. After all, who needs a pirate’s life when you can be a digital ninja, slicing through the competition with ease and agility?

The Virtual Admin Revolution

The Surprising Story of Virtualization and Pirates

As I read Rich Brambley’s recent post “A Pirate Invented Server Virtualization,” I couldn’t help but think of a peculiar story from my own production environment. This tale is a couple of years old, but sadly, it’s still relevant today. It revolves around a specialized application that we run in our environment, which requires a SQL Server Express instance, a proxy/licensing server, and client installation with license files to function properly.

This application is not particularly advanced or resource-intensive, making it an ideal candidate for virtualization. However, when we attempted to set it up in a virtualized environment, we hit an unexpected roadblock: the application developers had put checks in place to prevent the proxy/licensing service from being installed on virtual machines!

After we had successfully tested and verified the solution in our environment, the developer blocked us from implementing it as desired. The reason given was that the checks prevent easy duplication of the proxy/licensing service in a virtual environment, which could otherwise be used to bypass their concurrent user license model.

The checks are based on a hardware ID generated from the physical hardware, which a virtual machine cannot supply. The developers could have worked around this by using the server’s DNS name or even the NIC’s MAC address as part of the hardware ID check; instead, they opted to completely block the installation and operation of that particular part of their infrastructure when run virtualized.

This decision was not only bizarre but also unfair, as it limited our ability to utilize the application in a more efficient and cost-effective manner. We were stuck with a physical server for this service, even though it would have been much better suited for a virtual environment.

The irony of this situation is that the developers of this application, who should have embraced virtualization as a way to improve their product’s flexibility and scalability, instead chose to restrict its use. This reminds me of Dilbert’s comic strip where he jokes about pirates inventing virtualization. Perhaps there is some truth to that!

As a vNinja, I believe in embracing virtualization as a powerful tool for improving IT efficiency and agility. It allows us to break free from the limitations of physical hardware and create more flexible, scalable, and cost-effective infrastructures.

In conclusion, the story of the application developers who blocked the installation of their proxy/licensing service on virtual machines serves as a cautionary tale. It highlights the potential risks of relying solely on physical hardware and the importance of embracing virtualization as a key component of modern IT strategies.

As Dilbert would say, “Was virtualization indeed created by pirates?” Maybe so, but as a vNinja, I’ll continue to use virtualization to my advantage, even if some developers choose to ignore its benefits.

Dynamic Variable Creation with Conditional Logic in Microsoft Community Hub

Understanding Azure DevOps Pipelines: Best Practices and Common Issues

As a beginner in Azure DevOps, I recently encountered an issue while working with pipelines. Specifically, I was trying to pass variables to a template and reuse them in conditional expressions. However, I found that referring to the previously created variables did not work, and I was left with unreadable code and duplications. In this blog post, we will explore the best practices for working with Azure DevOps pipelines and how to overcome common issues like this one.

Why Does Referring to Previously Created Variables Not Work?

Before we dive into the solutions, let’s understand why referring to previously created variables does not work in Azure DevOps pipelines. The issue is related to the scope of variables in pipelines. In Azure DevOps, variables are defined within a specific scope, such as a pipeline or a stage. Once the scope is changed, the previous variables are no longer accessible.

For example, if you define a variable in one pipeline or stage and then try to reuse it in a template or another pipeline, it will not be available, because each scope has its own set of variables. To overcome this, you need to pass the variables explicitly, as template parameters or as inputs to the next pipeline or stage.
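To make this concrete, here is a minimal sketch of the pattern. The file name `templates/deploy.yml` and the names `environmentName` and `targetEnvironment` are placeholders for this example, not from any particular project. The template declares the value it needs as a parameter, and the calling pipeline passes its own variable in explicitly instead of expecting the template to see it:

```
# templates/deploy.yml - the template declares the values it needs as parameters
parameters:
  - name: environmentName
    type: string
    default: 'dev'

steps:
  - script: echo "Deploying to ${{ parameters.environmentName }}"
    displayName: 'Deploy'
```

```
# azure-pipelines.yml - the calling pipeline passes its variable in explicitly
variables:
  targetEnvironment: 'production'

steps:
  - template: templates/deploy.yml
    parameters:
      # ${{ variables.targetEnvironment }} is resolved at compile time, which
      # works here because the variable is defined at the pipeline root
      environmentName: ${{ variables.targetEnvironment }}
```

Because `${{ }}` template expressions are resolved when the pipeline is compiled, the variable being passed must already exist at that point, which is one more reason to define variables at the top of the pipeline.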

Best Practices for Working with Azure DevOps Pipelines

To avoid issues like the one I faced and to ensure a smooth experience working with Azure DevOps pipelines, here are some best practices to follow:

1. Use meaningful variable names: When defining variables in your pipeline, make sure to use meaningful names that clearly indicate their purpose. This will help you understand the code better and avoid confusion when referring to the variables later on.

2. Define variables at the beginning of the pipeline: It’s a good practice to define all variables at the beginning of the pipeline so that you can reuse them throughout the pipeline.

3. Use the correct scope for variables: Make sure to use the correct scope for your variables. For example, if you want a variable to be available across multiple stages, define it in the pipeline scope rather than the stage scope (see the sketch after this list).

4. Pass variables as inputs: When passing variables from one pipeline to another, make sure to pass them as inputs rather than using the previous pipeline’s variables directly. This ensures that the variables are correctly passed and reduces the risk of issues like the one I faced.

5. Avoid duplication: To keep your code clean and maintainable, avoid duplicating variable definitions and conditional expressions. Instead, reuse the defined variables and expressions wherever possible.

6. Use comments and documentation: Finally, make sure to use comments and documentation throughout your pipeline code. This will help you and other developers understand the purpose of the code and make it easier to maintain.
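To illustrate the scoping rule from point 3, here is a hedged sketch; the stage, job, and variable names are invented for the example. The pipeline-scoped `buildConfiguration` is visible in both stages, while `stageOnlyVar` exists only inside the Build stage:

```
variables:
  buildConfiguration: 'Release'   # pipeline scope: visible to every stage

stages:
  - stage: Build
    variables:
      stageOnlyVar: 'build-only'  # stage scope: visible only within Build
    jobs:
      - job: BuildJob
        steps:
          - script: echo "Building $(buildConfiguration) ($(stageOnlyVar))"

  - stage: Deploy
    jobs:
      - job: DeployJob
        steps:
          # $(buildConfiguration) still resolves here; $(stageOnlyVar) would
          # not, because it is out of scope in this stage
          - script: echo "Deploying $(buildConfiguration)"
```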

Common Issues in Azure DevOps Pipelines and How to Overcome Them

In addition to the issue I faced, there are several other common issues that you may encounter when working with Azure DevOps pipelines. Here are some solutions to overcome these issues:

1. Error: “The template is not defined” – This error occurs when you reference a template that the pipeline cannot find. To resolve this issue, make sure the template file actually exists at the path you reference; template paths are resolved relative to the file that references them (see the snippet after this list).

2. Error: “The variable is not defined” – This error occurs when you try to use a variable that has not been defined in your pipeline. To resolve this issue, make sure to define the variable before using it.

3. Error: “The file/directory is not found” – This error occurs when you try to access a file or directory that does not exist in your pipeline. To resolve this issue, make sure to specify the correct path to the file or directory.

4. Error: “The variable is not of the expected type” – This error occurs when you try to use a variable that has an unexpected data type. To resolve this issue, make sure to define the variable with the correct data type.

5. Error: “The pipeline failed due to a syntax error” – This error occurs when there is a syntax error in your pipeline code. To resolve this issue, make sure to check your code for any syntax errors and fix them before running the pipeline again.
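As a quick illustration of the first two errors, here is a short, hypothetical snippet (the path `templates/build-steps.yml` and the variable name are invented). The template path must point at a file that actually exists, resolved relative to the referencing file, and the variable must be defined before anything reads it:

```
# azure-pipelines.yml
variables:
  projectName: 'MyProject'   # defined up front, so $(projectName) resolves

steps:
  # The path is resolved relative to this file; a typo here produces the
  # "template is not defined" / template-not-found class of errors
  - template: templates/build-steps.yml
    parameters:
      projectName: $(projectName)
```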

Conclusion

In conclusion, working with Azure DevOps pipelines can be challenging, especially for beginners. However, by following best practices and understanding common issues, you can overcome these challenges and ensure a smooth experience when working with pipelines. Remember to use meaningful variable names, define variables at the beginning of the pipeline, use the correct scope for variables, pass variables as inputs, avoid duplication, and use comments and documentation throughout your code. By following these practices, you can avoid common issues like referring to previously created variables not working and ensure that your pipelines are reliable, maintainable, and efficient.

Unveiling the Power of vRealize Network Insight 5.0

VMware vRealize Network Insight 5.0: Enhancing Network Visibility and Control

VMware has recently announced the release of vRealize Network Insight 5.0, a comprehensive network monitoring and management solution that offers a plethora of new features to enhance network visibility and control. This latest version introduces integrations with VMware SD-WAN by VeloCloud and Public Cloud Azure, along with other major features such as Network Flow Round Trip Time (RTT) metrics with NSX-T, Kubernetes topologies, and application discovery.

Enhanced Integration with VMware SD-WAN by VeloCloud

One of the most significant enhancements in vRealize Network Insight 5.0 is the integrated support for VMware SD-WAN by VeloCloud. This integration enables organizations to gain better visibility and control over their WAN infrastructure, including the ability to monitor and troubleshoot network performance, application traffic, and security policies. With this feature, IT teams can now easily deploy and manage their SD-WAN environments from a single platform, simplifying network management and reducing operational costs.

Public Cloud Azure Integration

Another exciting feature in vRealize Network Insight 5.0 is the integration with Public Cloud Azure. This integration enables organizations to extend their on-premises networks into the cloud, providing seamless connectivity and network services between their on-premises infrastructure and Azure public cloud. With this feature, IT teams can now easily manage their hybrid cloud environments, ensuring consistent networking policies and security across their entire infrastructure.

Network Flow Round Trip Time (RTT) Metrics with NSX-T

vRealize Network Insight 5.0 also introduces Network Flow Round Trip Time (RTT) metrics with NSX-T. This feature enables organizations to gain better insights into their network performance, including the time it takes for data packets to travel from the source to the destination and back. With this information, IT teams can now identify bottlenecks and optimize their network configurations for improved performance and scalability.

Kubernetes Topologies and Application Discovery

The latest version of vRealize Network Insight also includes Kubernetes topologies and application discovery. These features give organizations better insight into their containerized applications, including the networking and security policies that govern their behavior. With this information, IT teams can more easily identify and troubleshoot issues related to containerized applications, ensuring optimal performance and security.

Conclusion

VMware vRealize Network Insight 5.0 represents a significant leap forward in network monitoring and management capabilities. With its enhanced integrations with VMware SD-WAN by VeloCloud and Public Cloud Azure, organizations can now gain better visibility and control over their hybrid cloud environments. Additionally, the introduction of Network Flow RTT metrics with NSX-T, Kubernetes topologies, and application discovery provides IT teams with the tools they need to optimize network performance, security, and scalability. With vRealize Network Insight 5.0, organizations can now confidently embrace the complexities of modern networking and cloud adoption, while ensuring optimal performance and security for their applications and data.

Effortlessly Exporting CSV Files from ASP Pages

Exporting Data from an ASP Page to Microsoft Excel

In my recent project, I had to write an ASP page that reads data from a SQL database and exports it to Microsoft Excel. The user clicks on a link to this ASP page, and Excel opens up with a new spreadsheet containing the data. In this blog post, I will share how I accomplished this task, and the techniques I used to ensure seamless compatibility with different versions of Excel.

Before we dive into the technical details, let me clarify that we are dealing with CSV (comma delimited text) files here. This means that the data is exported as plain text, which can be opened in any spreadsheet program, not just Microsoft Excel. However, for the sake of simplicity, I will refer to Excel throughout this post, since it is the most widely used spreadsheet application.

The First Step: Creating an ASP Page

To start, we need to create an ASP page that connects to our SQL database and retrieves the data we want to export to Excel. This can be done using ADO (ActiveX Data Objects), which is a set of components that allow developers to interact with databases and other data sources.

Here’s a brief overview of how to create an ASP page that connects to a SQL database:

1. Create a new ASP project in Visual Studio or your preferred IDE.

2. Instantiate the ADODB objects with `Server.CreateObject`; classic ASP scripts do not need a compile-time reference to the library.

3. Use the ADODB connection object to connect to your SQL database and retrieve data.

4. Use the ADODB recordset object to iterate through the retrieved data and export it to CSV format.

The Second Step: Exporting Data to CSV Format

Now that we have our ASP page set up, we need to write the code that exports the data to CSV format. This can be done in a variety of ways, but I prefer to open an ADODB recordset with the query results and then write each record out to the response as a comma-separated line.

Here’s an example of how to export data to CSV format using ADODB:

```
Dim conn, rs, field, line

' Connect to the SQL database
Set conn = Server.CreateObject("ADODB.Connection")
conn.Open "DRIVER={MySQL ODBC 8 ANSI Driver};SERVER=my-server;DATABASE=my-database;USER=my-username;PASSWORD=my-password"

' Fill a recordset with data from the database
Set rs = Server.CreateObject("ADODB.Recordset")
rs.Open "SELECT * FROM my_table", conn

' Write a header row with the column names
line = ""
For Each field In rs.Fields
    If line <> "" Then line = line & ","
    line = line & field.Name
Next
Response.Write line & vbCrLf

' Write one comma-separated line per record
' Note: values containing commas or quotes would need escaping in a real page
Do While Not rs.EOF
    line = ""
    For Each field In rs.Fields
        If line <> "" Then line = line & ","
        line = line & field.Value
    Next
    Response.Write line & vbCrLf
    rs.MoveNext
Loop

' Close the recordset and connection
rs.Close
Set rs = Nothing
conn.Close
Set conn = Nothing
```

This code connects to the SQL database, opens a recordset with the results of the query, writes a header row containing the column names, and then writes one comma-separated line per record to the response. Once the loop completes, the recordset and connection are closed and released.

The Third Step: Handling Different Versions of Excel

Now that we have our ASP page exporting data in CSV format, we need to make sure the output opens cleanly in different versions of Excel. The trick is to send the right HTTP headers before writing any data: a Content-Type header that identifies the output, and a Content-Disposition header that gives the download a file name ending in .csv so Windows hands it to Excel.

Here’s how to send those headers at the top of the ASP page, before any data is written:

```
' Send these headers before writing any CSV data
Response.ContentType = "text/csv"
' Some older Excel versions behave better with "application/vnd.ms-excel"
Response.AddHeader "Content-Disposition", "attachment; filename=my-data.csv"
```

With these headers in place, the browser downloads the output as my-data.csv and hands it to Excel (or whatever spreadsheet program is registered for CSV files). Excel then displays the comma-separated values as a normal worksheet, with the column-name row we wrote first appearing at the top, so the data is properly formatted and easy to read in different versions of Excel.

Conclusion

In this blog post, we covered how to export data from an ASP page to Microsoft Excel using CSV format, and how to keep the output compatible with different versions of Excel by sending the right HTTP headers along with a column-header row. By following these steps, you can easily create an ASP page that exports data to Excel with seamless compatibility across versions of the software.

Apple TV+ Hit ‘Severance’ Scores Second Season Renewal

Severance: The Long Wait for Season 2 Continues

It’s been more than two years since the first season of the critically acclaimed sci-fi series Severance concluded, and fans have been eagerly awaiting the show’s return. Production has been plagued by delays, and Apple has now announced that the second season will premiere on January 17th, more than 30 months after the first-season finale in the spring of 2022.

The show, which is set in a dystopian world where employees at a mysterious company called Lumon undergo a process called “severance” that separates their work and personal lives, has been praised for its innovative storytelling and retro-futuristic design. The first season ended with the severance barrier being broken, and the second season is expected to explore the consequences of this event.

The ensemble cast, led by Adam Scott, Britt Lower, Tramell Tillman, Zach Cherry, Jen Tullock, Michael Chernus, Dichen Lachman, and John Turturro, will return for the second season, along with Oscar-winning stars Christopher Walken and Patricia Arquette. New to the cast is Sarah Bock, who joins as a series regular.

Despite the excitement around the show’s return, there have been reports of tension behind the scenes, with showrunners Mark Friedman and Dan Erickson allegedly clashing, and cost issues causing delays. According to a report by Puck News, Friedman was on the verge of leaving the show before being convinced by producer Ben Stiller to stay.

The wait for Severance season 2 has been long, but fans can take comfort in the fact that the show will continue to explore its unique and thought-provoking themes when it returns on January 17th. The show will air weekly until March 21st, and Mac & i will provide updates and coverage of each episode as they air.

So, what can we expect from Severance season 2? Will the consequences of breaking the severance barrier be catastrophic or a new beginning for the characters? Will we see more of the mysterious Lumon company and its intentions? The wait is almost over, and fans can hardly contain their excitement to find out what’s next for this beloved show.

How vNinja is Revolutionizing Virtualization Management with VMware

It’s always exciting to see one’s work being recognized and appreciated by others, especially when it comes to a respected company like VMware. Recently, I was made aware that my post on the differences between vSphere 4.1 and 4.0, which was featured on vNinja, has been used in VMware’s internal presentation material. Specifically, pages 44 and 45 of the vSphere 4.1 Deep Dive – Part 1 – v6.pptx presentation feature my post and screenshots.

I must say that I am both honored and humbled by this recognition. It’s gratifying to know that my content, which was created with the intention of helping others, has been found useful enough to be included in VMware’s internal resources. However, I do wish that I had been notified and asked for permission beforehand. While Iwan Rahabok, the creator of the presentation, did provide links and source attribution, a direct message from him would have been great as well.

As a blogger and content creator, it’s important to me that my work is used responsibly and with proper attribution. While I understand that sometimes mistakes can happen, I believe that open communication and transparency are essential in maintaining good relationships and avoiding any potential conflicts.

In this case, while I am thrilled that my content was featured in VMware’s presentation, I would have appreciated a direct message or notification from Iwan or someone else involved in the creation of the material. This would have given me an opportunity to provide additional context, clarify any misunderstandings, or simply say thank you for finding my work helpful.

Despite this, I am still grateful for the recognition and appreciation that my work has received from VMware and others in the virtualization community. It’s what motivates me to continue creating valuable content and sharing my knowledge with others. And who knows, maybe one day I’ll have the opportunity to collaborate directly with VMware or other industry leaders on future projects.

In conclusion, while I would have liked to be notified and asked for permission beforehand, I am still proud to see my work being used in VMware’s internal resources. It’s a testament to the value of vNinja’s content and the importance of sharing knowledge and expertise within the virtualization community. Thank you to everyone who has supported me on this journey so far, and I look forward to continuing to create valuable content for years to come.