Featured

Agile VS Architecture?

This is actually a trick question: contrary to common belief, the two do not exist at the same level in the IT framework. There is a common fallacy that enterprise architecture interferes with the fluidity of the continuous delivery/continuous improvement model. In a high-performing IT environment, architects and delivery teams have drawn the line between planning and delivery so that the architecture teams provide a strategic roadmap for products. As delivery teams receive requirements, those can be matched against the roadmap to ensure that designs are aligned with the overall strategic direction of the IT organization.

When properly implemented, an environment that incorporates an Agile methodology alongside a functional enterprise architecture can increase effectiveness, because the architectural framework reduces the amount of design that needs to be done within the sprint cycle. Concurrently, by following an architectural framework, development teams will have a library of reusable assets to draw upon as they start their development cycle. On the IT service management side of the equation, operational and support teams also benefit, as an increase in the amount of reused assets simplifies the training and knowledge needed to support the product portfolios.

Without an enterprise architecture, product teams are often left to their own devices. As the organization grows, this often leads to disjointed IT solutions across product teams… which in turn increases operational complexity and creates confusion for customers and executives when they attempt to describe the portfolios being provided.

From the horse’s mouth

I went to ChatGPT and put in a simple ask: write a paragraph highlighting the benefits of AI

The response: Artificial intelligence (AI) offers transformative benefits across nearly every industry by improving efficiency, accuracy, and innovation. AI systems can analyze vast amounts of data in seconds, enabling better decision-making in fields like healthcare, finance, and education. In healthcare, for example, AI helps doctors detect diseases earlier and personalize treatment plans. Businesses use AI to automate repetitive tasks, reduce costs, and improve customer service through chatbots and predictive analytics. AI also enhances accessibility with tools like speech recognition and real-time translation, empowering people with disabilities and connecting global communities. By handling routine work and uncovering insights humans might miss, AI allows individuals and organizations to focus more on creativity, strategy, and meaningful problem-solving.

Next question: write a paragraph highlighting the risks of ai

The response: While artificial intelligence offers many advantages, it also presents significant risks that must be carefully managed. AI systems can reinforce bias if they are trained on flawed or unrepresentative data, leading to unfair outcomes in areas like hiring, lending, and law enforcement. Automation driven by AI may displace certain jobs, creating economic disruption and widening inequality if workers are not supported through retraining and education. There are also serious privacy concerns, as AI can process and analyze massive amounts of personal data. In addition, the misuse of AI for misinformation, deepfakes, cyberattacks, or autonomous weapons raises ethical and security challenges. Without thoughtful regulation, transparency, and human oversight, AI systems could make high-stakes decisions with limited accountability, potentially causing harm at scale.

The lesson here is that AI is nothing more than a tool, and even the tool recognizes that. Tools can be used for good or bad. Tools are not skilled; the user of the tool needs to be. The issue with AI arises when users become so dependent on it that they do not take the time to validate the output. While AI can be very helpful in producing that output faster, it still needs human oversight to weed out bias or hallucinations. While always true, this is even more important for less mature LLMs, as they have a smaller collection of data points on which to base their inferences.

Happy Holidays

Whether you celebrate Christmas, Hanukkah, Kwanzaa or simply Festivus (for the rest of us), have a safe and happy holiday season, and we'll be back in the new year.

With so much happening in the AI universe this past year, we’ll have a lot to talk about in January.

In the meantime, we are curious to hear how you are using AI at work, school or home so feel free to send us your thoughts.

ChatGPT… miracle or monster?

Discussion of AI-generated content has recently moved beyond the tech world and into the headline news. The debate revolves around the value that AI can bring vs. the abuse that it enables.

As an IT professional who works in healthcare, I have seen models that do predictive analysis, which means we can diagnose issues faster than a clinician might spot the factors that combine to warn of a problem. At the same time, I see daily how that same technology is used to track my every move and fill my browser with suggestions based on what I'm looking at. It's helpful at times, but it also throws me into pools I don't belong in simply because I'm trying to understand what the latest splinter group is about. Can't I ask who a group is without being labelled as one of them?

The most publicized concern over ChatGPT comes from high schools, colleges and universities, where students are using it to write papers. A tool that can write a complete paper from nothing more than a topic creates a challenge for educators, who will need to examine each submission that much more closely. While it cannot fully copy a student's writing style, it will take much greater scrutiny for an educator to determine whether the work submitted sounds like the student's own voice.

On the positive side, I have found ChatGPT to be very useful in creating PowerShell and Python scripts to simplify system management tasks. I would recommend keeping requests very generic to avoid inadvertent sharing of IP with the providers of the tool (if you don't get that, make sure you read the agreement fine print very closely).
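To give a sense of what a deliberately generic request can produce, here is a minimal sketch along the lines of "write a Python script that reports disk usage and warns when a volume passes a threshold." The mount points and threshold are illustrative placeholders rather than anything from a real environment, and the script is just the style of output such a prompt tends to yield, not a recommended tool.

```python
# Sketch of a generic system-management script; the mount points and the
# warning threshold are placeholders to be adapted to your own systems.
import shutil

MOUNT_POINTS = ["/", "/var", "/home"]   # illustrative paths
THRESHOLD_PCT = 80                      # warn above this percentage used

def check_disk_usage(paths, threshold):
    """Print usage for each path and flag any that exceed the threshold."""
    for path in paths:
        usage = shutil.disk_usage(path)
        pct_used = usage.used / usage.total * 100
        status = "WARN" if pct_used > threshold else "OK"
        print(f"{status:4} {path:10} {pct_used:5.1f}% used")

if __name__ == "__main__":
    check_disk_usage(MOUNT_POINTS, THRESHOLD_PCT)
```

Note that nothing in the request or the script identifies a company, hostname or internal path, which is the point of keeping the prompt generic.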

Facets of IT Architecture

What parts of IT need to be architected?

  • Process
  • Data
  • Application
  • Storage
  • Network
  • Platforms
  • Servers

I would say all of the above. But what do we mean by architect? The concept of architecture in IT follows that of civil engineering. It's really quite simple: let's refine our model on paper (metaphorically speaking) before we spend money on hardware.

If I had to highlight the biggest weakness I've encountered over the years, it would be process. Too many folks in IT cave to executive pressure to deliver fast. The problem with moving fast is that you don't always have time to ask the necessary questions about requirements. And I've found from experience that it takes ten times longer to fix an issue than it would have taken to design for the requirement up front. The true goal of architecture isn't to add red tape; it is to simplify delivery. By clarifying the goals up front, we can reduce the time and effort to deliver a functional design.

I intentionally put Data as #2 on the list because I have found that a lack of understanding of the data model is a common culprit in over-engineering the infrastructure. IT is, at its core, the mechanism for collecting, storing and delivering data by the most efficient means possible. A good data architect is vital to understanding the model and, from that, to providing the best infrastructure to meet the business goals.

Next on the list is applications. This is often the black hole of IT, as most application teams have a difficult time providing a true architecture. So many times I've been given port requirements or TCP diagrams in response to requests for architecture. The reality is that the architecture needs to define producers and consumers, along with the methods by which data is communicated, in order to produce a true picture of the work that needs to be supported.

The last four (storage, network, platforms and servers) are the composite subgroup that makes up IT infrastructure. In general, this is a collaboration of several teams, typically directed by a CTO or an enterprise architect. The goal of this group is to evaluate business forecasts to ensure that capacity planning activities provide reasonable resources in advance of the business needs. If capacity is underforecast, delays occur during projects as they become subject to availability and supplier timelines. If overbuilt, valuable budget dollars end up in a storeroom or sitting idle in the data center.

The overall purpose of incorporating architectural methodologies into your environment is to ensure that your organization can produce repeatable and efficient results in each project that comes along. A critical output of architecture is asset harvesting: during the retrospective, the architect looks at whether any new processes, tools or artifacts were generated and adds them to the library for future reference.

New Year’s Resolutions

It's that time of year when folks make a lot of promises that usually falter before Groundhog Day. This year, I've started a notebook with the plan to make a weekly post. As you may see, I have a vested interest in process and methodologies… for good reason, since my 35-year IT career came more from how I do things than from the tech that evolves every 18 months. However… emerging tech is just as important in planning an IT roadmap, so I'd love to hear what others see as important to their strategic directions.

Is Kaspersky Still a Thing?

On Friday, March 25th, the FCC added Kaspersky to its list of restricted companies alongside two Chinese companies. In a statement regarding the move, Chair Jessica Rosenworcel commented that this “will help secure our networks from threats posed by Chinese and Russian state backed entities seeking to engage in espionage and otherwise harm America’s interests.”

The Kaspersky response accused the Federal Communications Commission of playing politics, stating that the action was purely a response to the invasion of Ukraine and was not made on any technical basis.

This wasn't the first time Kaspersky Lab had been singled out by the United States government. In December 2017, President Trump signed a law banning their antivirus software from federal agencies under suspicion that the popular software was being used in a cyber espionage collaboration with the Russian government.

In both instances, Kaspersky claimed innocence and counter-accused the US of exerting political bias against the company. The latest action has been labelled as retaliation against the Russia-based company over the actions of Putin and the Kremlin in Ukraine.

Whether this was politically-motivated or a true response to a real and present danger, what this geek finds amusing is that somehow Kaspersky is actually still in the marketplace (I didn’t realize they were).

What would ending Daylight Saving Time mean to the tech world?

This week, the U.S. Senate unanimously passed a bill to make Daylight Saving Time permanent for most of the United States. While it must still make it through the House and be signed by President Biden, it is very likely to clear those steps. There may still be some social debate about very late sunrise times, especially for those with children who will be waiting for school buses in the dark during the winter months. However, the bigger impact may need to be planned for by the tech industry.

Many of us remember the scurry and panic over the anticipated Y2K bug. For those who do not, it was the unknown effect on computer systems and programs as we transitioned from 1999 to 2000, caused by the common use of two-digit years instead of four-digit years in many places. As many computer systems use a localized configuration setting to translate UTC strings into local time, tech professionals may need to ensure that their systems are set to adapt to this change, if it is adopted.
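To make that concrete, here is a minimal sketch (assuming Python 3.9+ and the illustrative zone "America/New_York", neither of which comes from this post) of how a UTC string becomes local wall-clock time through the tz database that the operating system's time zone setting points at. The DST rules live in that database, so a change in the law would arrive as a tz data update that every such system needs to pick up.

```python
# Minimal sketch: converting a UTC timestamp to local time via the tz database.
# The zone name is illustrative; real systems read it from an OS-level setting.
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

LOCAL_ZONE = ZoneInfo("America/New_York")

def to_local(utc_string: str) -> datetime:
    """Parse an ISO 8601 UTC string and convert it to local wall-clock time."""
    utc_time = datetime.fromisoformat(utc_string).replace(tzinfo=timezone.utc)
    return utc_time.astimezone(LOCAL_ZONE)

# The same UTC instant lands at different offsets under today's rules,
# because the seasonal switch is encoded in the tz database.
print(to_local("2022-01-15T12:00:00"))  # winter: UTC-5 under current rules
print(to_local("2022-07-15T12:00:00"))  # summer: UTC-4 under current rules
```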

Why do we even have DST in the first place? The practice began during World War I and was originally implemented by Germany to conserve energy in the factories producing materials for the war. The United States, which had only just implemented standard time zones in the 1880s, adopted the practice soon after. After the war, the management of DST was turned over to the states until 1966, when it was again put into practice federally as a response to transportation safety issues.

In the world of computers, the chore of changing clocks has become an automated practice managed at the server level, based on a time zone setting in the operating system. The question will be how much time and effort is needed to review all systems to ensure that such a significant change can be safely rolled out to all of the critical systems that run environments such as healthcare, finance and military defense. On the bright side, should a law establishing permanent Daylight Saving Time be passed, there would be a window of opportunity for everyone responsible for data center operations or software maintenance to verify that the change can be safely made. Similarly, since the clock change only takes effect twice a year at a single instant, the updated rules could be rolled out to servers at any time prior to the moment they would actually apply.

Should passage become inevitable, we as computer operations professionals should begin creating test plans to ensure that our systems can be updated efficiently and that all of our time-sensitive applications receive sufficient regression testing, so we do not experience any critical failures at the time when the clocks would normally have changed.
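As a sketch of what one such regression check might look like (the zone and dates below are illustrative assumptions, not anything from this post), a simple test can pin down the expected UTC offsets on either side of the date when the clocks would normally have changed; if permanent DST became law and the tz data were updated, the expected values in the test would change with it.

```python
# Hedged sketch of a regression check around a would-be clock change.
# The zone and dates are illustrative; the asserted offsets encode current rules.
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

ZONE = ZoneInfo("America/New_York")

def utc_offset_hours(local_date: str) -> float:
    """Return the UTC offset, in hours, of noon local time on the given date."""
    noon = datetime.fromisoformat(local_date + "T12:00:00").replace(tzinfo=ZONE)
    return noon.utcoffset() / timedelta(hours=1)

def test_offsets_around_november_transition():
    # Under current rules, US clocks fall back on the first Sunday of November.
    assert utc_offset_hours("2022-11-05") == -4.0  # day before the change
    assert utc_offset_hours("2022-11-07") == -5.0  # day after the change

if __name__ == "__main__":
    test_offsets_around_november_transition()
    print("offset checks passed")
```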

Concurrently, we need to advise our lawmakers of the amount of effort needed for implementation, to ensure that an arbitrary date is not chosen that leaves insufficient time for that testing.