The Transformative Power Of AI, Automation, And Big Data Analytics In Civilian And Defense Agencies


Government agencies are increasingly adopting transformative, cutting-edge technologies to stay relevant. These technologies are not just enhancing operational efficiency but also elevating service quality, optimizing resource allocation, and fortifying national security. It’s no surprise that AI, automation, and Big Data analytics have taken center stage in reshaping the operations of three pivotal agencies: the CHIPS Program Office, the General Services Administration (GSA), and the US Coast Guard.

Participating in a recent panel at the GovFuture Forum event in DC in August 2023, three government officials and leaders shared a broad range of perspectives on how AI, automation, and data are reshaping the future of government. They discussed the challenges of AI adoption in government, how federal agencies address concerns about data privacy and security, and how the CHIPS Act program supports research and development initiatives to foster technological advancement and maintain competitiveness in the global market.

Impacts of AI and Automation at the General Services Administration (GSA)

The General Services Administration (GSA) is a cornerstone of government operations and procurement, and the agency has experienced a profound transformation due to AI, automation, and Big Data analytics. Gopa Nair, Innovation Adoption Lead at the GSA TTS Center of Excellence, shared the challenges the federal government can face when it comes to AI adoption.

Gopa shares, “What we see is that a lot of people can create models, and a lot of models already exist out there. The key is: how do you adopt them and scale across the government? I want to share three things based on experience. Number one, identify the right use case with the right technology. If you had one hour to solve a problem, you would take 55 minutes to understand what it is and five minutes to solve it. Similarly, in any AI project, please take time to understand what the right use case is and what the right technology is. When I say right technology, being in the intelligent automation space, we span from RPA to AI, so we have the opportunity to see all these different types of tools.”

“But at the same time,” he continues, “there's a danger that when you have a hammer, whatever appears is a nail. For example, because you have an RPA tool, every problem becomes an RPA solution. Or you have an AI model, so every problem becomes an AI solution. What we are trying to say here is: identify the right use case and spend some time analyzing what the right tool is for it. That is the first step we spend a lot of time on, because sometimes, as you know, you can just write Python code and create an automation; you don't need an RPA tool. But at the same time, there will be situations where it will be nicer to have an RPA solution rather than go for a complex AI model. So that analysis is the most fundamental and important part of the adoption challenges we've seen.”
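To make Gopa's scripting-versus-RPA point concrete, here is a minimal, hypothetical sketch (not an actual GSA workflow; the folder name and column names are assumed) of the kind of repetitive task plain Python can automate without an RPA platform or an AI model: sweeping a folder of CSV exports and writing a short summary report.

```python
import csv
from pathlib import Path

# Hypothetical example: summarize a folder of CSV exports without an RPA tool.
# Assumes each file has an "amount" column; adjust names to your own data.
def summarize_exports(folder: str, report_path: str) -> None:
    files = sorted(Path(folder).glob("*.csv"))
    total = 0.0
    rows_read = 0
    for csv_file in files:
        with csv_file.open(newline="") as handle:
            for row in csv.DictReader(handle):
                rows_read += 1
                try:
                    total += float(row["amount"])
                except (KeyError, ValueError):
                    # Surface malformed rows instead of silently skipping them.
                    print(f"Skipping malformed row in {csv_file.name}: {row}")
    Path(report_path).write_text(
        f"Files processed: {len(files)}\n"
        f"Rows read: {rows_read}\n"
        f"Total amount: {total:.2f}\n"
    )

if __name__ == "__main__":
    summarize_exports("exports", "summary.txt")
```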

Gopa continues, “The second one: data is the key. First, having the right data; second, the quality of that data. The way we mitigate this is that before we start a project, we have an intake process and a checklist we created, so that we can understand whether this problem is a good fit for this model: the model is there, the problem is there, and yes, we have the data. We had a situation where we received data as images, but the image quality was not good enough for the model. Those types of checks make sure of the data quality, because, as you all know, for any AI you need quality data. Checking all of this upfront will save a lot of headaches down the line.”
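As a hypothetical illustration of that intake-checklist idea (not GSA's actual process; the thresholds and file types are assumed), a small pre-flight script can flag images that fall below a minimum resolution before they ever reach a model.

```python
from pathlib import Path

from PIL import Image  # Pillow: pip install pillow

# Hypothetical intake check: flag images below a minimum resolution.
# The thresholds are illustrative, not values used by any agency.
MIN_WIDTH, MIN_HEIGHT = 640, 480
IMAGE_SUFFIXES = {".png", ".jpg", ".jpeg", ".tif", ".tiff"}

def find_low_quality_images(folder: str) -> list[str]:
    """Return messages for files that fail the minimum-resolution check."""
    failures = []
    for path in sorted(Path(folder).iterdir()):
        if path.suffix.lower() not in IMAGE_SUFFIXES:
            continue
        with Image.open(path) as img:
            width, height = img.size
        if width < MIN_WIDTH or height < MIN_HEIGHT:
            failures.append(f"{path.name}: {width}x{height} is below {MIN_WIDTH}x{MIN_HEIGHT}")
    return failures

if __name__ == "__main__":
    for message in find_low_quality_images("intake"):
        print(message)
```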

Finally, Gopa concludes, “Number three, the most important and critical thing is to get alignment. What I mean by this is that when the data folks create a solution and want to implement it or onboard it onto infrastructure, you should have buy-in from the IT department or, in this case for example, maybe a small business office. There should be coordination and facilitation among all the stakeholders. The way we do it, we bring the CDO shop, the CDOs, the CISOs, as well as the cloud folks, everybody together, and facilitate that meeting widely so that everybody is on board. That stakeholder alignment includes making sure of the data platform, because a lot of senior (leadership) have data platforms of their own. So make sure of the right platform and the buy-in, and, if something is going to the cloud, that the cloud people will allow you to put it on the cloud. Those are some of the practical things you should be thinking about upfront, making sure that whoever the stakeholders are, they are all at the table so that you can have a discussion and make sure they're aligned. And that is a key part.”

CHIPS for America (CHIPS.gov) Advancing Hardware for AI

The global pandemic exposed the fragility of the semiconductor supply chain and just how critical certain components truly are. While the United States remains a global leader in semiconductor design and research and development, it has fallen behind in manufacturing: US production now accounts for only about 10 percent of global commercial semiconductor production. The CHIPS and Science Act of 2022 provides the Department of Commerce with $50 billion for a suite of programs to strengthen and revitalize the U.S. position in semiconductor research, development, and manufacturing.

Ayodele Okeowo, Director of Intergovernmental Affairs at the CHIPS Program Office, shared his insights into how the CHIPS program supports research and development initiatives to foster technological advancements and maintain US competitiveness in the global market.

Ayodele states, “When we look at the CHIPS Program Office and the $52 billion that we're responsible for administering, $39 billion of that is for manufacturing incentives going to industry, into fab projects and supply chain projects around the country. But $11 billion of that is allocated for research and development efforts, and I'm going to speak about one piece of that research and development program. Like I said, it's a pot of money, $11 billion, to ensure that we maintain and strengthen our leadership in innovation and research and development. We don't necessarily produce the chips anymore, but we still design them; the companies, the minds, not just for semiconductors but for emerging technologies, for critical material sciences, for quantum computing, those are still here domestically, and we want to continue to foster that kind of innovation.”

Ayodele continues, “And so the four components of this research and development program are, first, the National Advanced Packaging Manufacturing Program: as the chips get smaller and we try to keep up with Moore's Law, we have to innovate and start to stack in a 3D sense rather than trying to continue to shrink to the nano, atomic level. We're also going to stand up as many as three more manufacturing institutes. We have about 17 of them across the country right now; these are distributed industry, public, and private, and we want to build out three more of those. We also want to continue to innovate in metrology and the measurement sciences, which are key, again, to trying to keep up with Moore's Law and these technologies. And then what we would consider the focal point is the National Semiconductor Technology Center. This is going to be a public-private consortium that's very ambitious.”

Modernization Efforts at the US Coast Guard

The US Coast Guard, entrusted with safeguarding the nation's maritime interests, has harnessed AI, automation, and Big Data analytics to excel in its mission. Captain Patrick M. Thompson currently serves as the Director of Infrastructure for the United States Coast Guard C5I Service Center, leading all information technology, hardware, networks, and cloud technologies. Captain Thompson shared how the US Coast Guard is addressing concerns about data privacy and security in an era of increased data collection and utilization.

He shares that “one of the biggest problems we have in modernizing is this inability to buy hardware. Almost every time I do a procurement now for any type of IT system, the delays are crazy. I mean, sometimes nine months would be fast. Obviously I'm buying major amounts of hardware at a large scale, but it's interesting: since COVID, it’s had a huge impact on our ability to modernize.”

Captain Thompson goes on to explain, “One of the things I noticed is that the government has all the tools to secure its enterprise, right? But just like industry, we make mistakes all the time. The question is: why? Why are things misconfigured? Why are machines not getting patched? Why is this happening?”

“A lot of the time, when I went in to analyze the after-action reports, I asked the ten levels of ‘but why.’ They're like, oh, we got hacked. But why? We got hacked because there was a machine that wasn't patched. Okay, but why wasn't it patched? Well, the technician didn't patch it. Okay, why did the technician not patch it? A lot of times I figured out that we have a few different reasons.”

Captain Thompson continues, “One, at a high level, when you look at the sea of contracts the government has to deliver anything, you end up with multiple vendors who are responsible. Why did that happen? Well, we had a vision, we wanted to deliver this service, so we hired someone to do the service. We went out with an RFP and they came in and delivered the service. Over time, what I've seen is that everything you deliver in the government requires a huge amount of integration across different areas, and you really can't have too many different hands in the pot, right? You have to have a single vendor who's responsible for delivering that service.”

He further explains, “When you're doing that with on-prem services, it just gets hard and complicated. I think one of the ways we can improve that is to move to the cloud and move to integrated models, so instead of having to buy a physical server and wait nine months for that thing to get delivered and installed, you can just spin up a virtual server. If you don't need something, you can turn it down. That flexibility of being able to deliver things quickly and at scale to meet your needs is very valuable. Being able to have a trusted single entity that ensures these things are configured and patched is super important. Having a single dashboard where you can see that those things are getting done, and leveraging the automated tools that already exist in a lot of these cloud environments, without having to pay more and install some service that takes months to deliver and configure, has a huge impact on delivering things securely.”
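The “spin it up, turn it down” elasticity Captain Thompson describes can be shown with a minimal, hypothetical sketch. It assumes an AWS environment and the boto3 SDK purely for illustration; the panel did not specify the Coast Guard's actual cloud platform, and the region, AMI ID, and instance type below are placeholders.

```python
import boto3  # AWS SDK for Python: pip install boto3

# Hypothetical sketch: provision a virtual server on demand, then release it.
# Region, AMI ID, and instance type are placeholders, not real agency values.
ec2 = boto3.client("ec2", region_name="us-east-1")

# Launch one small instance instead of waiting months for physical hardware.
response = ec2.run_instances(
    ImageId="ami-00000000000000000",  # placeholder image ID
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched {instance_id}")

# When the capacity is no longer needed, turn it down and stop paying for it.
ec2.terminate_instances(InstanceIds=[instance_id])
print(f"Terminated {instance_id}")
```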

“So if you look at a lot of our internal plans for how we're going to get to AI and data analytics, a lot of them are based on a cloud-first model: we need to get the legacy tools and systems we have into the cloud, get them modernized, get them on modern database backends, and start to leverage more SaaS solutions for data analytics platforms instead of installing things on-prem ourselves. I think at the beginning there was a thought that maybe there are a couple of things we can move to the cloud. Now the mindset is starting to shift from ‘a couple of things we can move to the cloud’ to ‘what has to stay on-prem, and why does it have to stay on-prem?’ And I think that answered itself, especially after we moved to Office 365 and people realized all of their data is in the cloud, along with the value proposition of being able to work remotely and have your data sitting on your iPhone or iPad with you 24/7.”

“We had all sorts of issues with email going down, and problems in the past with making those things work. Those problems have kind of diminished and gone away. I think the other thing we're seeing is that when you go to more SaaS solutions, the vendors can iterate quickly; every two weeks you get a little bit of improvement. Some people don't even notice those improvements, right? If you've used Office 365 for the last few years, you might not have noticed the little micro-improvements they've been adding to the service. I didn't have to go out and train my users on what they were; things just kind of evolved, right? And you can do that in a cloud service in a much faster way than the government could ever do it on-prem.”

Additional insights around the use and importance of data, automation, and AI were shared on the panel at the GovFuture Forum event in DC in August 2023.

Disclosure: Kathleen Walch is an Executive Director at GovFuture.
