Shaping the Future of Learning and Development: Insights from an Expert

Subramanyam N, the Head of Learning and Development, shared his expertise in learning and development.

Q1: In the next 5 years, where do you see Learning and Development contributing more than before?

Learning and Development is already an important cog in the talent supply chain and talent transformation wheels of the organization. L&D leaders are a critical part of deciding whether to build or buy talent. The current best practice is:

Build – for consolidation, and for developing leaders within the organization to preserve cultural integrity
Buy – for rapid change and for niche skills

In the next five years, L&D will ensure organizations are agile enough to operate seamlessly, whether niche skills have to be built or leadership has to be inducted inorganically. Building agility to deal with uncertainties, and smart alignment with artificial intelligence and digital transformation, will be the success criteria for the fraternity.

Q2: It is said that while L&D leaders are building learning journeys for the organization, they sometimes forget about themselves. How do smart organizations address this?

It sounds real, though it is a bitter pill to swallow: the blind spot of the Johari Window. Learning organizations set learning journeys based on roles and competencies. By choice and design, if the learning paths are aligned with career paths, with a strong structure and accountability, the result is continuous development across the organization. Smart organizations organically position L&D in their structure, which drives L&D leaders to keep learning themselves. I always believe and propagate that "leaders must embark on the path of followership initially," and L&D is no exception.

Q3: How is the landscape of learning changing with the changing working styles and new operating procedures at work, especially the delivery methods?

The integrity of Learning and Development has remained intact, but the delivery modes are seeing a paradigm shift, and we still have no conclusive evidence favoring any one model. There are three levers of learning delivery – in-person instruction, virtual instruction, and self-paced learning. The pandemic forced the fraternity to go virtual and use micro-learning nuggets, but a two-year window cannot serve as a sample for drawing trends or conclusions.

Q4: So where should L&D focus?

The real change is facilitating hybrid working and its ripple effect on learning. There are two important opportunities to explore:

Help talent learn how to work in a hybrid environment
Help talent learn how to learn in a hybrid learning environment

While delivery methods will see a natural, forced shift in balance, attention to the details above is more critical for the overall well-being of talent in any organization.

Q5: Training effectiveness being the most critical parameter of any training program, how must L&D measure the effectiveness of training programs?

All parameters of measurement, be it hiring, training, performance, promotions, rewards, or recognition, culminate in the people goals of the organization. Across these touchpoints, roughly 70% of the measurement criteria are common, and the remaining 20%-30% are custom-made for specific touchpoints. This makes for a sustainable competency model with significant scope to stay agile and creative. For a manager's role on a specific project, the competencies could be engineering excellence, customer engagement, emotional intelligence, and so on. These are the areas we measure when we hire and select associates, create performance metrics for the role, and even promote an associate (internal hiring) into it. With this premise, measuring effectiveness is two-fold. We adopt a 30-60-90 day post-learning feedback timeline, seeking feedback from the manager and other relevant stakeholders on the competency areas where gaps were observed; this helps us measure training and learning effectiveness together. And for some specific programs, a 360-degree review can be both the starting point of the learning program and the review point for measuring training and learning effectiveness.

Q6: How should the L&D department address the different learning styles and preferences of employees to create inclusive and engaging learning experiences?

The learning paths are woven so that each learner can choose to develop skills through the method that best suits their learning style. Except for compliance training, all interventions offer learners this flexibility. Implicitly, a 70-20-10 training delivery model is in place: some associates seek on-the-job opportunities to scale up, some choose mentoring programs, peer learning, or social learning, and some choose self-paced hybrid learning methods. All of this happens in consultation with L&D to ensure the right learner has opted for the right approach, maximizing the time invested in learning and development. Hence, a significant percentage of interventions are self-nominated or self-driven. For content strategy, instructional-design first principles are applied to achieve the correct mix of inductive, deductive, experimental, and experiential learning. The decision-making process weighs two factors: the best method for a specific piece of content, and the best method for that content for a specific learner type or cohort.
Mastering Performance Engineering: Insights from an Expert
Yashwanth Dak Jain G, the Head of Performance Engineering and Security Testing, shared his expertise in performance engineering.
Q1: Could you provide insights into the diverse forms of performance testing, such as load, stress, and scalability testing, and elucidate their purposes?
Certainly! Performance engineering is the practice of designing, developing, testing, and optimizing a system or application to meet its performance goals and requirements. Within that, load testing verifies behavior under expected demand, stress testing pushes the system beyond normal limits to find its breaking points, and scalability testing checks whether performance holds as the workload grows.

We aim to optimize resource utilization by reducing waste and cost and to project maximum efficiency, so the system can handle increased workloads without compromising performance. Our primary focus is consistent performance under all conditions, exceeding user expectations with a responsive and reliable experience.
Q2: In the context of application performance, how would you highlight potential bottlenecks within the system?
For starters, measuring response times and resource usage to pinpoint potential issues helps identify bottlenecks. Collecting and analyzing data from the application detects problems faster, and continuous monitoring of the application's code and design for areas that need optimization has helped us avoid blocks. We dissect each under-performing area, using root cause analysis to pinpoint bottlenecks quickly.
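As a minimal sketch of that first step (hypothetical endpoints and a hypothetical latency budget, not a description of Excelsoft's actual tooling), the snippet below times repeated requests and flags endpoints whose 95th-percentile response time exceeds the budget:

```python
import statistics
import time

import requests  # third-party; pip install requests

# Hypothetical endpoints and latency budget, purely for illustration.
ENDPOINTS = ["https://example.com/api/login", "https://example.com/api/search"]
P95_BUDGET_SECONDS = 0.5
SAMPLES = 20

def measure(url: str, samples: int) -> list[float]:
    """Time `samples` GET requests to `url` and return elapsed seconds."""
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        requests.get(url, timeout=10)
        timings.append(time.perf_counter() - start)
    return timings

for url in ENDPOINTS:
    timings = sorted(measure(url, SAMPLES))
    p95 = timings[int(0.95 * (len(timings) - 1))]
    flag = "SLOW" if p95 > P95_BUDGET_SECONDS else "ok"
    print(f"{url}: median={statistics.median(timings):.3f}s p95={p95:.3f}s [{flag}]")
```

In practice, an APM tool such as New Relic collects this kind of telemetry continuously; a script like this only illustrates the measure-then-flag idea.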
Q3: What tools and techniques are most valuable for performance testing and monitoring?
Each ecosystem has tools suitable for it, and there are a plethora of options in the market. For performance testing and monitoring, we actively leverage JMeter for versatile load generation across ecosystems, alongside New Relic, SQL profiler, and JetBrains dotTrace for performance issue diagnosis.
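To give a flavor of what load generation does under the hood, here is a small, hypothetical Python sketch that simulates concurrent virtual users with a thread pool; a real test would normally be a JMeter test plan with ramp-up profiles, assertions, and reporting rather than a hand-rolled script:

```python
from concurrent.futures import ThreadPoolExecutor
import time

import requests  # third-party; pip install requests

# Hypothetical target and load shape, purely for illustration.
TARGET = "https://example.com/api/health"
CONCURRENT_USERS = 25
REQUESTS_PER_USER = 40

def user_session(user_id: int) -> tuple[int, int]:
    """Simulate one virtual user; return (successes, failures)."""
    ok = fail = 0
    for _ in range(REQUESTS_PER_USER):
        try:
            resp = requests.get(TARGET, timeout=10)
            ok += resp.status_code == 200
            fail += resp.status_code != 200
        except requests.RequestException:
            fail += 1
    return ok, fail

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
    results = list(pool.map(user_session, range(CONCURRENT_USERS)))
elapsed = time.perf_counter() - start

total_ok = sum(r[0] for r in results)
total_fail = sum(r[1] for r in results)
print(f"{total_ok} ok, {total_fail} failed in {elapsed:.1f}s "
      f"({(total_ok + total_fail) / elapsed:.1f} req/s)")
```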
Q4: What's the relationship between performance engineering and user experience (UX)?
Performance engineering and user experience (UX) are closely linked. They’re two sides of the same coin when building successful digital products.
At Excelsoft, we understand that performance significantly impacts UX, and UX influences performance engineering by setting performance requirements. Performance engineering ensures applications are fast, responsive, and efficient, eliminating frustrations and delays that detract from a positive user experience.
UX research helps identify user pain points and expectations around speed, responsiveness, and resource usage. This input guides performance engineers to prioritize improvements that make the most significant impact on user satisfaction.
Q5: Since applications for mobile and desktop are different, the performance engineering for the two should be different, too. What would be the fundamental differences?
Mobile apps try to use resources wisely because phones and tablets have limits, while web apps prioritize compatibility, interoperability, and scalability across browsers with varying standards. Each type of app has its own way of dealing with challenges based on where it runs.
Q6: In your perspective, how does performance testing contribute to cost optimization in cloud deployments?
Performance testing plays a critical role in cost optimization and system efficiency. The insights gained during performance testing inform decisions that reduce resource consumption, optimize scaling, and aid in selecting the right cloud provider and planning for cost optimization. Identifying and addressing performance issues early helps avoid costly problems like application crashes or slowdowns that might otherwise necessitate scaling up to larger, more expensive instances.

Our team optimizes across all investment components, ensuring the best possible outcomes.
Q7: What skills and knowledge are essential for a career in performance engineering?
You need to have your detective glasses on! Jokes apart, you must have a knack for spotting risks in complex systems. You also need to be well-versed in performance testing and monitoring tools like LoadRunner, JMeter, New Relic, Dynatrace, Perfmon, and various profilers. You must be ready for creative problem-solving, applying performance optimization techniques like caching, compression, minification, and parallelization to crank up system performance. Communication is key: whether it's a chat with developers, testers, managers, customers, users, or fellow performance engineers, your communication game needs to be strong.
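As a small, self-contained taste of two of those optimization techniques, the sketch below combines caching (via Python's functools.lru_cache) with parallelization (via a thread pool); slow_lookup and its timings are invented stand-ins for an expensive call:

```python
from concurrent.futures import ThreadPoolExecutor
from functools import lru_cache
import time

@lru_cache(maxsize=1024)
def slow_lookup(key: str) -> str:
    """Stand-in for an expensive call; results are memoized by lru_cache."""
    time.sleep(0.2)  # simulate I/O or heavy computation
    return key.upper()

keys = ["alpha", "beta", "alpha", "gamma", "beta", "alpha"]

start = time.perf_counter()
# Parallelization: overlap the slow calls across worker threads.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(slow_lookup, keys))
print(f"first pass:  {time.perf_counter() - start:.2f}s -> {results}")

start = time.perf_counter()
# Caching: repeated keys now hit the memoized results instantly.
results = [slow_lookup(k) for k in keys]
print(f"cached pass: {time.perf_counter() - start:.2f}s -> {results}")
```

On a typical run, the first pass takes a fraction of the naive sequential time because the calls overlap across threads, and the cached pass returns almost instantly because repeated keys hit memoized results.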