Problem Overview
The Context of the Problem
Large technology projects, whether developing new technologies or upgrading current systems or software applications, can be costly. In larger organizations, a million-dollar (or larger) project is not unusual. Once a project is rolled out to production, it is important to evaluate its performance. This is generally a comparison of the anticipated benefits used in making the decision to move forward with the project against the actual performance of the systems or software once in use. Various methods may be used to evaluate performance; however, it is important to develop a broad set of standards for assessing the systems or software.
The Problem
Your organization has made a very large investment in the purchase of infrastructure or development of an in-house software application. As examples, the network infrastructure has had a hardware refresh, business analysis data tools have been implemented, or a new customer resource management software tool has been implemented. Your team must assess the performance of the newly launched technology. You will be providing the various stakeholders (user community, project managers, and senior leadership) with the plan to be used for conducting the performance assessment, including the process of collecting performance data, analysis methods, and an explanation of the appropriateness of the methods to be used (the data may be concocted or gathered from a representative system).
As you work through the problem, be sure to focus on reaching these learning outcomes:
- Optimize organizational processes using data analysis.
- Assess the potential of various software to enhance organizational performance.
- Evaluate applications for the potential to improve collaboration, sharing, and lowering cost.
- Manage application development to lower cost and improve quality and customer satisfaction.
- Maximize the return on organizational technology investments.
- Develop application policies and procedures consistent with the Virtuous Business Model.
- Assess the challenges, technologies, and system approach issues in developing and deploying applications.
Deliverable
Step 5: Performance Assessment and Change Management Plan
Presenting the Solution
Your deliverable for Step Five is a Performance Assessment—a final report that builds on the deliverables you created in Steps Two through Four.
The final report to stakeholders related to the performance of a technology investment will depend on what constitutes an actionable performance report. What must the stakeholders be able to use in determining success? In a technology environment, this can often be difficult to define. Stakeholders may not have enough of a technical background to understand measurements from a system. Qualitative measures may be easier to recognize than bytes or CPU cycles. Technical measurement must often be reported in layperson terms. The team will need to assess the data and develop a means to report the data when providing a solution.
The team will need to determine what makes the technology investment successful in the eyes of the stakeholders. The team will need to translate the measurement results to a narrative analysis while providing understandable measurements as evidence of success.
The Change Management Plan to Implement the Solution
Good problem solvers know that finding the solution is not the end of the process; the solution must be implemented. Some solutions are easy to implement, and others can be very challenging. All solutions, however, require a change. It is possible that the investment in technology has not produced the quality of performance originally hoped for. In such cases, good leaders know that managing change requires a strategy. In this course, we will assume your team experiences both success, as measured by performance criteria, and less-than-successful results, in which case your team must reach a conclusion on the solution in the form of recommendations to resolve performance. Now you must develop a brief Change Management Plan to implement the solution.
Instructions for Deliverable
- As a team, prepare a Performance Assessment report that shows the investment in technology was a success according to the results of the solution testing. The report will follow this structure:
- Professional cover page
- List of contributors
- Table of contents
- Executive summary
- An executive summary is designed primarily to serve the person who, at least initially, does not intend to read the entire report. It states the main points of each section and emphasizes results, conclusions, and recommendations, usually in around three pages.
- Executive summaries are ideally suited to the needs of leaders who are seeking advice about a decision or a course of action. These summaries are called executive summaries because some decision-makers rely wholly upon their advisors to read and evaluate the rest of the report.
- For the purposes of this assignment, the Executive Summary should be no more than two pages and should concentrate on the background of the problem, test measures conducted, and a summary of findings.
- Organized sections of findings (for example, goals of the testing, action steps, protocols, resources, definitions of terms, etc.)
- A separate section in which you develop a set of recommendations, assuming now that the findings of the measurement research did not support a successful investment. You should offer some potential recommendations to mitigate the failed project.
- The Change Management Plan
- References in APA Style format (for scholarly or practitioner resources that are cited or used within the plan)
- Appendices (OPTIONAL—attached materials, tools, documents, samples, templates, etc. that are part of the solution)
- As a team, create a briefing in PowerPoint to accompany your team report. The PowerPoint briefing must include the following:
- Statement that summarizes the report
- Brief description of how measures were determined for use in the evaluation
- Justification for reporting the success of the project
- Conclusion that summarizes success based on the measurements
- A brief set of recommendations if the project were deemed to have missed the mark on performing as anticipated
- The Change Management Plan
- If any resources are cited in the report, use APA style to format in-text citations and the references list.
Introduction
An evaluation of a recently introduced technology should begin with a self-assessment of the team's skills and experience, which helps identify the best person for each role. The team should then devise a plan for delegating measurement tasks, confirm that the team has a leader, and ensure that each member understands their role in the project. Once the plan is in place, the team can begin collecting data on the effectiveness of the new technology through approaches such as questionnaires, interviews, focus groups, and direct observation (Jalal, 2017). The team should then analyze the data to identify areas for improvement and recommend changes.
Plan of Action
The creation of a plan of action is the initial stage in carrying out a performance evaluation of a recently introduced technology. The plan should include a self-evaluation of each team member's background, experience, and skills, followed by a review of each member's responses and a conclusion about which people best fit each position. If the team lacks knowledge of a particular measure, the members need to agree on which of them is best suited to research that measurement's features. In addition, the team should devise a plan for delegating measurement tasks, confirm that the team has a leader, and make certain that each member understands their place in the project (Jalal, 2017). Once the plan is in place, the team can begin collecting data on the effectiveness of the new technology through questionnaires, interviews, focus groups, and direct observation, then analyze the data to identify areas for improvement and suggest changes.
Self-assessment of background skills and experience
To conduct an accurate evaluation of how well the new technology works, we will need to analyze the knowledge and expertise our team possesses in data analysis, software development, and project management. A comprehensive assessment requires staff with a variety of expertise and practical experience. Technology projects can incur significant costs, so it is essential to evaluate how well they perform in order to determine whether the investment was worthwhile (Hall, 2011). There are many approaches to collecting performance data; nonetheless, it is essential to construct a comprehensive set of criteria before completing an evaluation. Once all the necessary information has been gathered, it must be examined to assess whether the technology is living up to the standards established before the decision to proceed with the project.
Team member 1:
In addition to a solid foundation in mathematics and statistics, this member has expertise in data analysis. He is confident that he can devise the appropriate procedure for accumulating performance data and carry out the analysis on his own. He also has extensive experience in the creation of policies and procedures, as well as a solid comprehension of the Virtuous Business Model.
Team member 2:
This member has a solid background in project management and a thorough understanding of how to evaluate the potential of different technologies to improve business performance. He also has extensive experience managing application development to reduce costs while improving quality and customer satisfaction. He is confident that he can devise the appropriate procedure for accumulating performance data and carrying out the analysis.
Review of Self-Assessment
The major objective of the evaluation is to examine and contrast the expected advantages of the new technology project with the actual performance of the systems or software once it has been put into operation. This will assist in determining if the investment was worthwhile and whether the implementation of the new technology is truly increasing the performance of the firm (Jalal, 2017).
To evaluate how well the new technology really works, performance data will be gathered from a sample population intended to be representative of all users. This information may contain measurements such as satisfaction levels, task-completion times, and error rates. Depending on the exact data to be gathered, performance data can be collected using a variety of approaches. For instance, satisfaction data may be gathered through surveys, while task-completion times can be gleaned from user logs or other system data (Hall, 2011).
The information will be evaluated so that the predicted benefits of the new technology can be contrasted with its actual performance. This might include qualitative approaches such as interviews or focus groups, in addition to quantitative ones such as statistical analysis. The evaluation is expected to produce a report on the effectiveness of the new technology. With the help of this report, we will establish whether the investment was worthwhile and whether the implementation of the new technology is truly helping to improve organizational performance.
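As an illustration of how such a comparison might be carried out, the following Python sketch contrasts hypothetical task-completion times (as might be pulled from user logs) and survey satisfaction scores against a predicted benefit. All figures, including the 25% improvement target, are invented placeholders, not real measurements:

```python
from statistics import mean

# Hypothetical sample data: task-completion times (minutes) from user
# logs, and satisfaction scores (1-5) from a survey. All figures are
# illustrative, not real measurements.
task_times_before = [12.0, 11.5, 13.2, 12.8, 11.9]  # legacy system
task_times_after = [8.1, 7.9, 9.0, 8.4, 8.6]        # new system
satisfaction_scores = [4, 5, 3, 4, 4]

def percent_change(old, new):
    """Relative change from the old average to the new average."""
    return (mean(new) - mean(old)) / mean(old) * 100

time_change = percent_change(task_times_before, task_times_after)
avg_satisfaction = mean(satisfaction_scores)

# A predicted benefit (e.g., "tasks 25% faster") can then be checked
# directly against the observed change.
predicted_improvement = -25.0  # anticipated 25% reduction in task time
met_target = time_change <= predicted_improvement

print(f"Task time change: {time_change:.1f}%")
print(f"Average satisfaction: {avg_satisfaction:.1f}/5")
print(f"Met predicted target: {met_target}")
```

The same pattern extends to any metric where an anticipated benefit was stated up front: compute the observed change and compare it against the target.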
Strategy for assigning measurement tasks
The approach the team uses to allocate measurement tasks should be created so that every member is aware of their place in the project and of the qualities they bring to its accomplishment. It is the responsibility of the team leader to make sure every member understands their function and how it fits into the larger picture. Each member should be assigned a particular attribute to evaluate and should be in charge of gathering data and reporting their findings to the team leader (Hall, 2011). The team leader is then responsible for compiling the data and reporting back to the stakeholders. The group must have a well-defined plan for delegating the various measurement duties: the leader delegates responsibilities to each member and ensures that everyone knows their place in the overall project, and the group should discuss in depth the characteristics they will be assessing.
Conclusion
An accurate evaluation of new technology is essential to determine whether the investment was worthwhile. To understand how well the new technology works, performance data will be gathered from a sample population intended to be representative of all users (Hall, 2011). This information may contain measurements such as satisfaction levels, task-completion times, and error rates. The information will be evaluated so that the predicted benefits of the new technology can be contrasted with its actual performance, using qualitative approaches such as interviews or focus groups in addition to quantitative ones such as statistical analysis. The evaluation is expected to produce a report on the effectiveness of the new technology.
References
Hall, J. A. (2011). Information Technology Software Auditing and Assurance. Retrieved from https://strayer.vitalsource.com
Jalal, K. (2017). Software Infrastructure to Reduce the Cost and Time of Building Enterprise Software Applications: Practices and Case Studies. Retrieved from https://www.researchgate.net/publication/322267534_Software_Infrastructure_to_Reduce_the_Cost_and_Time_of_Building_Enterprise_Software_Applications_Practices_and_Case_Studies
Introduction
The primary purpose of conducting a performance assessment of newly launched technology is to ensure that the technology is meeting the needs of the organization and providing the expected benefits. This assessment can help identify any areas where the technology is not performing as intended and may need to be adjusted or improved (Jalal, 2017). Additionally, the performance assessment can help identify any potential issues that could impact the future use of the technology. This paper will therefore discuss how various stakeholders can conduct the performance assessment.
Optimize organizational processes
The optimization of organizational processes may be accomplished in a number of ways using data analysis. One approach is to use data to uncover bottlenecks and inefficiencies in existing processes. This may be accomplished by monitoring aspects of the process such as cycle durations, completion rates, and error rates, and then searching for patterns in the data that point to problematic areas (McShane, 2018). Data may also be used to build and test new process designs. This may be accomplished by employing simulation to evaluate how different process designs might influence cycle durations, completion rates, error rates, and similar factors, and then selecting the most successful design.
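A minimal sketch of the bottleneck-detection idea, using invented step names and cycle times in place of real workflow logs, might look like this:

```python
from statistics import mean

# Hypothetical cycle times (hours) per process step, as might be pulled
# from workflow logs; step names and figures are illustrative.
step_durations = {
    "intake": [0.5, 0.6, 0.4, 0.5],
    "review": [4.2, 5.1, 6.0, 4.8],
    "approval": [1.0, 1.2, 0.9, 1.1],
    "fulfillment": [2.0, 1.8, 2.2, 2.1],
}

# Flag any step whose average duration exceeds a threshold share of the
# total cycle time -- a simple way to surface bottlenecks in the data.
averages = {step: mean(times) for step, times in step_durations.items()}
total = sum(averages.values())
bottlenecks = [step for step, avg in averages.items() if avg / total > 0.4]

print(averages)
print("Bottlenecks:", bottlenecks)
```

The 40% threshold is an arbitrary illustration; a real analysis would choose a cutoff (or a statistical test) appropriate to the process being studied.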
Software to promote organizational performance
There are many different software programs that have the ability to improve the performance of a business. They include:
· Customer Relationship Management Software, often known as CRM, is a sort of program that may assist businesses in keeping track of and managing customer data, as well as sales data and marketing data (Jalal, 2017). This may help businesses better understand their consumers, enhance the targeting of their marketing efforts, and increase the percentage of leads that are converted into sales.
· Enterprise Resource Planning Software, often known as ERP, is a type of program that enables businesses to more effectively manage, monitor, and analyze their operational data and procedures. This can assist to increase both the efficiency of operations and the quality of decisions.
· Business Intelligence (BI) software: This category of software enables companies to gain a deeper comprehension of their data and improve the quality of the judgments they reach as a result (Jalal, 2017). The use of business intelligence software may assist companies in monitoring and analyzing trends, locating areas in need of development, and making more informed decisions.
Applications with the potential to improve collaboration, sharing, and lowering cost
There are many applications with the potential to improve collaboration, sharing, and cost reduction. They include:
· Cloud-based software: This may be used to store documents and data online, making them available from any location and simple to share with others. Tools such as Google Docs also build on these capabilities to support real-time collaboration (Mustafa, 2020).
· Video conferencing and webinars: These may be used to conduct virtual meetings, which can reduce the costs associated with travel. Sharing presentations and other types of content is another possible application for them.
· Project management software: These tools may be used to track tasks, deadlines, and overall progress on projects. They may also be used to delegate tasks and monitor their progress, both of which can improve collaboration and communication (McShane, 2018).
Application procedures and policies consistent with the Virtuous Business Model
Organizations should develop policies and procedures that are consistent with the Virtuous Business Model in order to maximize the return on their technology investments (Mustafa, 2020). The Virtuous Business Model is a framework that organizations can use to assess the performance of their technology investments. The model consists of four dimensions: value, quality, customer satisfaction, and organizational impact.
Value refers to the extent to which the technology investment has increased the organization's revenue or decreased its costs (Jalal, 2017). Quality refers to the extent to which the technology investment has met the organization's performance expectations. Customer satisfaction refers to the extent to which the technology investment has improved the organization's customer satisfaction levels. Organizational impact refers to the extent to which the technology investment has had a positive impact on the organization's overall performance.
To assess the performance of a technology investment using the Virtuous Business Model, organizations should collect data on each of the four dimensions. They should then use appropriate analysis methods to analyze the data and identify areas of improvement. Finally, they should develop policies and procedures that are designed to maximize the return on the technology investment (McShane, 2018).
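As a rough illustration of this kind of assessment, the sketch below scores the four dimensions and flags weak areas. The scores and weights are hypothetical placeholders that a real team would derive from its own collected data:

```python
# Illustrative scorecard for the four Virtuous Business Model
# dimensions. Scores (0-10) and weights are invented placeholders, not
# real measurements or recommended weightings.
scores = {
    "value": 7,
    "quality": 8,
    "customer_satisfaction": 6,
    "organizational_impact": 7,
}
weights = {
    "value": 0.3,
    "quality": 0.25,
    "customer_satisfaction": 0.25,
    "organizational_impact": 0.2,
}

# Weighted overall score, plus a simple flag for dimensions scoring
# below a chosen threshold (here, 7).
weighted_total = sum(scores[d] * weights[d] for d in scores)
improvement_areas = [d for d, s in scores.items() if s < 7]

print(f"Weighted score: {weighted_total:.2f}/10")
print("Needs improvement:", improvement_areas)
```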
Challenges, technologies, and system approach issues in developing and deploying applications
There are a few challenges that need to be considered when developing and deploying applications. One challenge is ensuring that the application is compatible with the various devices and operating systems that users may be using. Another challenge is ensuring that the application is able to scale appropriately to handle increased traffic or usage. Additionally, security is a major concern when developing and deploying applications, as sensitive data may be at risk of being compromised.
When it comes to technology, there are a few different options that can be used when developing and deploying applications. One option is to use a cloud-based platform, which can provide flexibility and scalability. Another option is to use a container-based platform, which can provide isolation between different applications. Additionally, there are various programming languages that can be used to develop applications, each with its own strengths and weaknesses.
System approach issues need to be considered when developing and deploying applications. One issue is how the application will be deployed, whether it will be deployed on-premises or in the cloud. Another issue is how the application will be monitored, as this will be important for troubleshooting and performance optimization. Additionally, it is important to consider how updates will be deployed, as this will need to be done in a way that minimizes downtime and disruption.
Evaluation
Technical attributes may include measures such as system uptime, response time, throughput, or error rates (McShane, 2018). Behavioral attributes may include measures such as customer satisfaction, number of support tickets, or number of calls to the help desk. System uptime is a measure of how often the system is available for use. Response time is a measure of how quickly the system responds to requests. Throughput is a measure of how much work the system can handle. Error rates are a measure of how often the system produces incorrect results.
Customer satisfaction is a measure of how satisfied users are with the system (Mustafa, 2020). The number of support tickets is a measure of how many problems users are having with the system. The number of calls to the help desk is a measure of how many users are having problems with the system.
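The technical attributes described above can be computed from ordinary monitoring data. The following sketch derives error rate, average response time, throughput, and uptime from a small invented request log; all figures are illustrative:

```python
# Hypothetical request log: (timestamp_sec, response_ms, ok) tuples, as
# might be exported from a monitoring tool. Figures are illustrative.
requests = [
    (0, 120, True), (1, 95, True), (2, 310, False),
    (3, 88, True), (4, 102, True), (5, 450, False),
    (6, 99, True), (7, 110, True), (8, 97, True), (9, 105, True),
]

total = len(requests)
errors = sum(1 for _, _, ok in requests if not ok)
error_rate = errors / total
avg_response_ms = sum(ms for _, ms, _ in requests) / total

# Throughput over the observed window (requests per second).
window_sec = requests[-1][0] - requests[0][0] + 1
throughput_rps = total / window_sec

# Uptime from a separate availability log: minutes up vs. scheduled.
minutes_up, minutes_scheduled = 1430, 1440
uptime = minutes_up / minutes_scheduled

print(f"Error rate: {error_rate:.1%}")
print(f"Avg response: {avg_response_ms:.0f} ms")
print(f"Throughput: {throughput_rps:.1f} req/s")
print(f"Uptime: {uptime:.2%}")
```

Reporting these as percentages and plain-language summaries, rather than raw counts, supports the earlier point that technical measurements must often be translated for non-technical stakeholders.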
References
Jalal, K. (2017). Software Infrastructure to Reduce the Cost and Time of Building Enterprise Software Applications: Practices and Case Studies. Retrieved from https://www.researchgate.net/publication/322267534_Software_Infrastructure_to_Reduce_the_Cost_and_Time_of_Building_Enterprise_Software_Applications_Practices_and_Case_Studies
McShane, S., & Von Glinow, M. (2018). Organizational Behavior (8th ed.). Boston, MA: McGraw-Hill.
Mustafa, B. (2020). Service quality as a profit strategy in marketing: The service-profit chain model. European Journal of Service Management.
Introduction
The objective of this study plan is to evaluate the viability of our solution against previously conducted test cases for companies operating in industries analogous to our own. In this section, we concentrate on how these use cases measure the performance of the various technical and behavioral attributes connected with a business's technology investment. Stakeholder viewpoints and data sources will be incorporated into our measurement framework, which we will use to assess and analyze the overall performance of our solution. After the solution has been implemented, we will conduct post-implementation evaluations to determine how it affected the organization. Change management will play a significant role in our overall research agenda. The plan will follow a defined format for presenting the findings of the data analysis.
Measurement framework
To present an all-encompassing picture of performance, the measurement framework must take into account the many stakeholder viewpoints as well as the various data sources. Stakeholder perspectives may come from a variety of sources, such as the user community, project managers, or senior leadership. Customer feedback, system logs, and performance statistics are three examples of potential data sources (Thabane, 2009).
The purpose of the measurement framework is to supply stakeholders with viewpoints and data sources that may be used to evaluate the effectiveness of a technology investment. The framework consists of four dimensions: behavioral characteristics, organizational factors, user factors, and technical qualities (McShane, 2018). Each dimension has its own set of performance indicators for evaluating how well the technology investment is working out.
Technical qualities include indicators such as system uptime, response time, and throughput. Behavioral qualities include indicators such as user satisfaction, adoption rates, and training costs. Organizational factors include indicators such as return on investment (ROI) and total cost of ownership (TCO). User factors include metrics such as user satisfaction, adoption rates, and training expenses (McShane, 2018).
The measurement framework draws its information from a variety of data sources, including organizational data, user data, performance data, and financial data. The return on investment (ROI) and total cost of ownership (TCO) of the technology investment may both be calculated from financial data (Jalal, 2017). The uptime, response time, and throughput of the system may be evaluated from the performance statistics. User data may be analyzed to determine factors such as user satisfaction, adoption rates, and training costs (Thabane, 2009). Organizational data may be analyzed for insight into aspects of the organization such as its culture, structure, and procedures.
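As a small worked example of the financial indicators, the sketch below computes TCO and ROI from invented cost and benefit figures over a five-year evaluation period; the numbers are placeholders, not recommendations:

```python
# Hypothetical cost and benefit figures (USD) for a technology
# investment; all numbers are illustrative placeholders.
purchase_cost = 500_000
annual_operating_cost = 60_000
annual_training_cost = 15_000
years = 5

annual_benefit = 180_000  # estimated revenue gain plus cost savings

# Total cost of ownership: purchase price plus recurring costs over the
# evaluation period.
tco = purchase_cost + (annual_operating_cost + annual_training_cost) * years

# Return on investment: net benefit relative to total cost.
total_benefit = annual_benefit * years
roi = (total_benefit - tco) / tco

print(f"TCO over {years} years: ${tco:,}")
print(f"ROI: {roi:.1%}")
```

Note that a marginally positive ROI like this one would still leave room for debate among stakeholders, which is why the qualitative dimensions of the framework matter as well.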
Effect of measurement on the general performance evaluation
There will be a variety of ways in which each measurement will influence the overall performance evaluation. For instance, if you are evaluating the performance of a brand-new software application, the data that you collect will demonstrate how effectively the application functions as well as the degree to which users are pleased with it. Using this information, you will be able to evaluate whether or not the application successfully satisfies the requirements of the organization and whether or not purchasing it was a wise financial decision (McShane, 2018).
There are a few different metrics that may be used to evaluate how successful a technology investment has been. The most typical approach is to examine how the technology is being utilized, by tracking the number of users, the amount of time spent using the technology, and the quantity of work completed. Monitoring errors and incidents is another approach to performance evaluation; this can be done by tracking the number of calls to customer support, the number of system crashes, and the number of software upgrades (Jalal, 2017).
Post-implementation evaluations
Post-implementation evaluations (PIEs) should be carried out after the technology has been released to the user community and is in active use. During this stage, the performance of the technology is evaluated to determine whether there are any areas in which it may be improved. PIEs are essential to success because they provide feedback on whether the newly implemented technology is living up to the organization's expectations (Jalal, 2017). When carried out, PIEs can help businesses recognize areas in which the newly implemented technology can be enhanced.
Change Management
Assessing the possible effects of a change is the first thing that has to be done when dealing with change management. This involves determining who will be impacted by the change, what that change will imply for them, and how they are most likely to react to it. After gaining an understanding of the possible effects that the change may have, a strategy may be devised to assist in the management of the transition. This may entail drafting new policies and procedures, offering assistance throughout the transition time, and providing training for individuals who may be impacted by the change (Aslam, 2010).
Format to Present Data Analysis Report
Tables, graphs, and charts are some of the usual formats for presenting the findings of data analysis. Tables are frequently used to display raw data, while graphs and charts depict trends and patterns within the data (Thabane, 2009). The format for presenting the findings will vary based on the particular data being analyzed and the audience for the report. In general, a data analysis report should include a clear and succinct explanation of the data being analyzed, the techniques used to analyze it, and the outcomes of the study. It is also essential to make certain that the report is clear and free of complicated terminology.
Test Solution
There are several ways to evaluate a technology investment's performance. Benchmarking is one strategy; this entails contrasting the performance of the technology investment with that of other businesses operating in the same or a related industry. Using a before-and-after strategy is another technique: the technology investment's performance is evaluated both before and after it has been made, using data gathered from users, clients, or other stakeholders (Aslam, 2010). A third option is a control group strategy, which entails evaluating the technology investment's performance for a group of users who have it and contrasting it with a group of users who do not (Thabane, 2009).
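A control-group comparison of the kind described above can be sketched with a simple difference in means. The groups, figures, and the crude effect-size calculation below are illustrative assumptions, not a full statistical analysis:

```python
from statistics import mean, stdev

# Hypothetical task-completion times (minutes) for a control group
# (without the new technology) and a treatment group (with it).
control = [14.1, 13.8, 15.0, 14.5, 13.9, 14.7]
treatment = [11.2, 10.8, 11.9, 11.5, 10.9, 11.4]

diff = mean(control) - mean(treatment)

# A crude effect size (difference in means over pooled spread) gives a
# sense of whether the gap is large relative to natural variation.
pooled_sd = (stdev(control) + stdev(treatment)) / 2
effect_size = diff / pooled_sd

print(f"Mean difference: {diff:.2f} min")
print(f"Effect size: {effect_size:.1f}")
```

With real data, a proper significance test would replace this back-of-the-envelope effect size, but the basic structure of the comparison is the same.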
After creating the study strategy, the team will need to test the solution. To do this, the group will need to create test cases: particular situations designed to evaluate the effectiveness of the technology investment (Thabane, 2009). The test cases should assess how well the technology investment performed against the precise goals listed in the evaluation framework. The team must create a test plan detailing the precise actions required to carry out the test cases, as well as a data collection strategy that specifies the precise information to be gathered in order to evaluate the effectiveness of the technology investment.
Conclusion
In summary, using a measurement framework is a crucial component of any performance assessment. The framework offers a method for taking into consideration the many data sources and the numerous stakeholder opinions, and this information can be used to assess the efficacy of a technology investment. Post-implementation assessments are also important to a firm's success. When dealing with change management, the first step is evaluating the potential implications of a change: identifying the people who will be affected, what the change will mean for them, and how they are likely to respond (Aslam, 2010).
References
Aslam, S., & Emmanuel, P. (2010). Formulating a researchable question: A critical step for facilitating good clinical research. Indian Journal of Sexually Transmitted Diseases, 31, 47-50.
Jalal, K. (2017). Software Infrastructure to Reduce the Cost and Time of Building Enterprise Software Applications: Practices and Case Studies. Retrieved from https://www.researchgate.net/publication/322267534_Software_Infrastructure_to_Reduce_the_Cost_and_Time_of_Building_Enterprise_Software_Applications_Practices_and_Case_Studies
McShane, S., & Von Glinow, M. (2018). Organizational Behavior (8th ed.). Boston, MA: McGraw-Hill.
Thabane, L., Thomas, T., Ye, C., & Paul, J. (2009). Posing the research question: Not so simple. Canadian Journal of Anesthesia, 56, 71-79.