Codoflow for Startups: Reinventing Data Architecture for AI Success #ai #technology #machinelearning #dataanalytics

Ever wondered why so many AI projects fall flat—despite great teams and tools? It often comes down to one silent saboteur: your data architecture.

In this episode, we sit down with Stefan Opitz, Founder & COO of Codoflow, to explore how SMEs can build AI-ready infrastructure by rethinking data modeling from the ground up.



What You'll Learn:

* Why traditional top-down data modeling fails fast-growing companies

* How bottom-up modeling enables real-time transparency and integration

* Why data quality and ownership are critical to AI success

* How Codoflow brings real-time collaboration to data design—finally killing the spreadsheet

* What SMEs must do today to future-proof their data for AI, compliance, and scale


Guest Spotlight:

Stefan Opitz, COO and Co-Founder of Codoflow, brings decades of experience in IT consulting and integration projects. Frustrated with outdated frameworks and documentation, he built Codoflow to provide a pragmatic, design-first, collaborative data modeling platform tailored for mid-sized companies and tech-forward teams.


Loved this conversation? Follow Startuprad.io, subscribe on Spotify and Apple Podcasts, and drop us a ⭐⭐⭐⭐⭐ review. Share this with a founder, CIO, or investor who cares about making smarter, AI-driven decisions.

📚 Show Notes

Guest Name: Stefan Opitz, Founder & COO at Codoflow

Blog Post: https://www.startuprad.io/post/the-complete-guide-to-data-architecture-for-smes-and-ai-integration

Relevant Resources:

• Codoflow Website (https://codoflow.io)

• Why Bottom-Up Data Modeling Beats Traditional Frameworks for SMEs (http://startuprad.io/post/why-bottom-up-data-modeling-beats-traditional-frameworks-for-smes)

• How Poor Data Quality Undermines AI Training and Business Intelligence (http://startuprad.io/post/why-bottom-up-data-modeling-beats-traditional-frameworks-for-smes)

• Real-Time Collaboration in Data Architecture (http://startuprad.io/post/real-time-collaboration-in-data-architecture-a-new-standard-for-smes)

Timestamps:

00:00 – Introduction

03:20 – Why most data architecture fails SMEs

08:45 – Codoflow's bottom-up modeling approach

14:22 – Data quality, AI, and governance

21:00 – Real-time collaboration and change management

32:10 – The entrepreneur’s journey behind Codoflow

40:50 – Codoflow’s roadmap and AI integration vision

47:30 – Final insights and takeaways


#technology #innovation #ai #machinelearning #datascience #tech #software #webdevelopment #bigdata #dataanalytics #python #programming #coding #engineering #computerscience

Category: 🤖 Technology

Transcript
00:00Welcome to Startuprad.io, your podcast and YouTube blog covering the German startup scene
00:13with news, interviews and live events.
00:20Hello and welcome everybody.
00:22This is Joe from Startuprad.io, your startup podcast and YouTube blog from Germany.
00:27Today, I would like to welcome Stefan Opitz here, the founder and COO of the Codoflow GmbH,
00:34which is a German form of LTD, a groundbreaking SaaS platform that is transforming data architecture
00:42management. With a rich background in IT consultancy and a deep understanding of the challenges in data
00:48integration, Stefan recognized the limitations of traditional top-down approaches to data modeling.
00:54This insight led to the creation of Codoflow, which adopts a bottom-up data modeling strategy,
01:00emphasizing granular data level design to automatically generate higher level architecture
01:06views. And this integrates automated system landscape generation, real-time collaboration,
01:14integrated change management and design first approach. Join us as we delve into Stefan's journey,
01:21exploring how Codoflow is enabling organizations to achieve transparent, collaborative and governed
01:28data model design, ensuring seamless integration and minimizing data risks.
01:34Huh, almost without a mistake. Stefan, welcome from Munich.
01:39Well, thank you for having me. And actually, I guess we are done now because we've said everything,
01:44so it's absolutely amazing.
01:46Yeah, we're done. Hey, guys, it was the shortest recording ever. Goodbye.
01:50Exactly. Glad being here.
01:52Yeah. Most efficient introduction ever. Check. Thank you to just me, my virtual assistant who made
01:58this possible. Thank you. That said, let us talk a little bit about who you are and what you did
02:07and how you went from consulting to founding Codoflow. And then we may already hint to our audience that
02:17data, data architecture, data architecture management is a pretty boring topic, but it gets very important
02:24if you want to train an AI on your own data.
02:27For instance, exactly. Yeah. As you said, actually, I'm coming from a consulting background and I still run
02:37like a very specialized consulting company and background. And fun fact is that we actually were
02:45looking for a solution first. So it's not like that, that we came up with the, well, let's found Codaflow
02:51and let's do something completely different. It was rather the other way around. So we were looking
02:57for solutions in a data architecture area because we were facing certain challenges over and over
05:02again, because we're doing a lot of integration and we're chasing after the data like all the time.
03:08And yeah, at some point, well, we simply said, okay, if there's nothing out there, which is really
03:15satisfying us and everything is usually targeting enterprise level, we simply said, okay, well,
03:22let's then simply try to disrupt the world and let's make it a better place, at least from the point
03:29of the data architecture. Make the world a better place. Okay. Yeah. That's a pretty high goal. Yeah.
03:36Yeah, I know. Bold, bold claim. Okay. Great. You talked about you want to do that stuff yourself. Can
03:48you tell, can you give us a little introduction, just a tiny bit on why data architecture is important
03:58for the companies out there? Not the ones who are very tiny and running on one server, but the ones as
04:05soon as they start growing, especially growing very fast. I would even go a little bit further than
04:10that. Actually, I believe data architecture is important for everyone and it will become more
04:15important. But the main challenges we were actually facing is that whenever we came into a new client
04:22and we were asking for their, let's say, how is your setup running right now? And how is your
04:29integration working? From a technology point of view, they were actually rather, like that was rather a
04:34short discussion because then you just show a couple of slides and then that's it. But as soon
04:39you wanted to actually deep dive a little bit into the details of it, like what information is being
04:45exchanged? Where is that actually being forwarded to and all that? Then it was becoming a completely
04:51different discussion because then people would not be able to answer that and would actually just send
04:56you somewhere like just like basically, well, go away and ask someone who knows. And then the next
05:03question was like, well, who knows? And well, well, we don't know, but that person might know. And then
05:08they just like push you like basically from door to door. And that is, I believe, one of the main
05:14main challenges in general that our IT world becomes more and more a network of different systems. I mean,
05:24there are studies out there saying like, okay, well, in average, it's 112 systems. Others say like it's
05:29even 200 or even more. All of that information is being exchanged. And apparently, to my, let's say,
05:40experience, I have never met a company who has full control on all that. And I think that is
05:48one of the, let's say, rather infamous elements of data architecture because it's always outdated.
05:57No one knows. And it's even identifying it. It feels like if you're comparing like this foldable
06:04maps for streets to Google Maps, right? So it's a little bit like the same way. So
06:10if you're lucky, you have an outdated plan of the 1970s and you are trying to find your way.
06:16And personal experience is that
06:22in the IT of very large organizations, there are usually a handful of people who do have
06:27an idea of the general data flows around. But it starts to get really tricky if you dive into the
06:34details when you want to put something together, like a data lake or something. Where's the data
06:41coming from? From this system? No, it's coming from this system. No, it's coming from... Then it gets
06:45real, real tricky. When we discussed before, you told me there's a bottom-up approach. So what do you do?
06:52Do you give a vote to all the bits out there? Well, something like that. So,
07:01well, bottom-up approach is, and I guess we need to put that a little bit into a perspective.
07:09Common frameworks, common methodologies usually break things down. So you start like from a very high
07:15level and then you just like breaking it out into like smaller pieces. And that is something which
07:21absolutely makes sense in certain areas. And apparently it has been, let's say, the typical
07:27approach for data architecture as well. Let's say that is part of the problem from my point of view,
07:35because when we are talking about data, if you actually talk to a person like these experts you
07:42were just mentioning, right? So if you have such an expert, they usually have a deep knowledge on a
07:49particular system, on a particular data structure. It's very rare you find someone who knows all the
07:55systems with all their data elements in there. It's like, I have never met someone and I don't believe
08:01it is humanly possible even to get there. For everybody who's now out there smiling, you should
08:10once at least once try to understand those massive databases who only are out there to grasp data
08:19flows. It turns your head upside down within a few days and after month, you have not fully grasped it.
08:29Absolutely. And that's exactly where we are coming from. If you think about this one expert,
08:36the expert knows exactly what the data structure looks like. And quite honestly, data structure is
08:42compared to a business process, which is likely, well, which is important as well, but it's usually
08:50designed, right? So it is something, well, they're like BPMN 2.0 models and all that. When it comes to data,
08:58we actually can take the data from the data source. It's actually available. It's not a design question at
09:04that point. It's actually just like taking the reality and mapping it into some sort of model.
09:10So, and if you're doing that, and if everyone would do that, then you are starting at the bottom,
09:16right? So, and if you're having now all the details, it's from a technical point of view,
09:22it's rather simple than to aggregate the information. And let's say you have a system containing
09:2910 objects and each object has 10 attributes. And that information is being shared with a different
09:35system and going to some other objects and attributes. Then you can simply, well, state,
09:42well, there is information flowing from system A to system B. It's something which is technically
09:47rather trivial to actually, well, get as an output. And that's kind of like what we did. We said,
09:54okay, make it simple. And if you start from the lowest level of detail, then you can aggregate the
09:59information to whatever you like, right? But that is the magic behind the bottom-up approach.
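To make the aggregation idea concrete, here is a minimal sketch (in Python, with entirely hypothetical system and field names, not Codoflow's implementation) of how field-level mappings recorded at the lowest level of detail can be rolled up into system-to-system flows:

```python
from collections import defaultdict

# Hypothetical field-level mappings captured bottom-up:
# (source system, source field, target system, target field).
field_mappings = [
    ("CRM", "customer.email", "Shop", "account.email"),
    ("CRM", "customer.id", "Shop", "account.crm_id"),
    ("Shop", "order.total", "ERP", "invoice.amount"),
]

def aggregate_system_flows(mappings):
    """Roll detailed field mappings up into system-to-system data flows."""
    flows = defaultdict(list)
    for src_system, src_field, dst_system, dst_field in mappings:
        flows[(src_system, dst_system)].append((src_field, dst_field))
    return flows

for (src, dst), fields in aggregate_system_flows(field_mappings).items():
    print(f"{src} -> {dst}: {len(fields)} field(s) exchanged")
```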
10:08So, let's go a little bit into implementation and start to understand how this normally looks like,
10:14because I would assume you need at least one point where you actually connect to the system and get
10:25some access to all the data flows, and then you do an automated mapping. Is this something that happens,
10:32and how long does your whole process take? Fair question. And quite honestly,
10:39that is one of the most asked questions when we are talking to customers. Let's put it this way.
10:48We believe that a tool in itself is not worth anything, right? So, you need to have the right
10:53methodology behind it. Otherwise, you are just a fool with a tool, right? And
10:59to actually get to a proper information. So, everyone has still this idea of, let's say,
11:07I want to automatically fetch information and send it somewhere, which translates for me into,
11:15oh, well, I want to give up the responsibility for my data quality and push it to some sort of API,
11:21to some sort of like automatism. And that is something which is contradicting to actually managing
11:27something. So, yes, there are possibilities to upload like information, like the data structure
11:36in that sense. We have focused actually a lot of our development just on that particular part.
11:44Nonetheless, it is part of the methodology to have someone who is actually responsible for the data
11:52architecture for one particular system, and even interface. Everyone should have actually a
11:57responsible person attached to it. It sounds weird, I know, because it's kind of like, well, you would
12:04expect that to happen, but reality is out there. Let's say there is a system, oh, we have just another
12:11one, okay, you already have two, now you have three, go, right? So, that's the sort of responsibility
12:20we see out there in the market, because quite honestly, it is nearly impossible for a company
12:27of a certain, let's say mid-size to have a dedicated person for each individual system. Nonetheless,
12:35we want to make it easy. So, and there are different ways on actually getting there. The simple path or
12:42the easy path is you just start with a couple of systems. You are just taking the information you know of,
12:51and you are just like putting it in, and you have a responsible person behind it. So, the implementation
12:57journey for that company starts essentially with two workshops, as easy as that. First one, we explain
13:04how it works, and the second one is we are discussing how the process, the change process actually works,
13:12because that is the most important part to keep the whole thing alive. Otherwise, you're just getting
13:17a snapshot. So, it needs work. It needs, well, taking responsibilities up on your desk and actually
13:26managing it from that moment forward. But the gains you're potentially getting out of that are by far
13:32outweigh that. We may add a little bit about the data quality, why this is important, because you can
13:40not have any good decisions without good data quality. For example, what comes to mind is especially
13:47if you don't have like the real structure and real data properly maintained in your system. For example,
13:55you cannot have any financial derivatives based on that, because there is like a big danger in that.
14:01And that basically works for a lot of other industries, a lot of other tools there as well,
14:10especially concerning pricing, especially concerning compliance and so on and so forth.
14:14And even pointing out the first point you mentioned in your introduction, everyone wants to use AI lately,
14:21right? That's like the new black, right? And you cannot use AI if you don't have proper data, because usually AI is,
14:34well, supposed to train on data. So, if you are not aware on your data quality and you don't know where it's coming from,
14:41well, how do you expect your AI or how can you even trust your AI model or your data model, which is coming out of that?
14:48For me, like there is a huge risk that, well, numbers are sort of like proving that it works, but do you know how long it will work?
14:59What if something changes? Like what if you need to train it on a different parameter? Where's the parameter coming from?
15:06Right. Yeah. But getting back to the implementation journey, like I was just like focusing on the easy part,
15:19right? So like, let's start simple. Of course, there's another way that you can actually go in and
15:25use some sort of consulting partner who is then dumping everything into the Codoflow environment
15:34and background to helping you setting up every tiny little bit. But overall, the whole process is,
15:42or the tools we are providing, it should not take longer than 12 and let's be generous, like 24 weeks,
15:51because otherwise the data is outdated again. That's like what the majority of frameworks are
15:56actually suffering from because they are starting. And actually, we spoke to customers who are starting
16:02with like traditional tools in the data architecture environment. They start looking at the, well,
16:08documenting it. And in the end, it takes like two years and then they have a picture, which is two years old.
16:14I mean, how useful is that? It's simply, well, not going to fly. That's very simple.
16:23And we are trying to change that. So that's exactly where we are really putting a lot of effort in,
16:28and to make it as easy and pragmatic and simple as possible to get to a starting point. And from that
16:37starting point, and that is, well, allow me to just point that out, we are not trying to document
16:42anything. It is not supposed to be a documentation tool in that sense. It's supposed to be a design
16:49tool. And I think that is like one of the major changes towards other methodologies because we believe
16:57that it should start inside of a design process and not ending in a documentation process to actually
17:04gain control of it.
17:05Mm-hmm. You focus on a certain type called SMEs, or in German KMU, where there are more than 3 million
17:17small and medium enterprises out there. I was wondering, how do you cater to the specific needs of this
17:29target group? Or why?
17:32Yeah. In general, data architecture in itself is usually part of something which is called
17:42enterprise architecture. It's a part of that. And that part is, let's say, very complex. And it is
17:49thought of, let's say, only for the enterprise companies. But enterprise companies is, well, maybe 25%.
17:58I don't know the exact number of the overall global industry. But the challenges of having multiple
18:07systems, and again, the numbers of 100 plus systems is an average overall industries. So it's true for
18:14the midsize companies as well. So it is actually a problem which is not reserved for enterprise companies.
18:22Actually, everyone has the problem. So it was time to actually provide a solution for
18:27everyone as well and not only the enterprise companies, right? So for that reason, we were
18:33focusing on like, okay, let's do it in a pragmatic way. Let's really try to, well, tear down the boundary
18:42of, well, you need to invest multiple millions to make it fly and wait for multiple years before you can
18:48actually start using it. So for that reason, we really try to come up with a way of working,
18:55which is doable or affordable in terms of effort for everyone and not just like companies who can
19:05create a different department, just focusing on that particular, well, element of the design process.
19:11Does that make sense?
19:13It does. We've been already talking about change management because change management is very important,
19:20especially if you do have new systems, because if you ever try to do to remove one system, just unplug it
19:27and plug the other one in that may work in a small network. But if you do have more than 20 employees,
19:34it will usually not work or at least as well as one would like to have it. So we need to do some change
19:50management. And how does Codoflow facilitate this effective change management within the IT systems?
19:50Good. Very good question, actually. I think that's like the biggest differentiator between our solution
19:59and, well, let's see, other approaches out there in the market. So here as well, pragmatism first.
20:07There is no need of a complex solution for a complex problem. So you need to come up with a simple way of
20:15solving something. And here is something, well, which we did not invent, but simply reused.
20:23Any software development company is so used to actually use code repositories. Essentially data and its
20:31architecture in that sense is something like a code file. It is a structured way of describing how your
20:40data looks like and how it's actually built up inside of a model. So you can use that and use versioning
20:48behind that, right? So that's number one. So if you are doing versioning and you can ask anyone who has
20:56ever done like some sort of like, let's say multiple people trying to contribute to a software project,
21:03you want to have something what is called a pull request, right? So you want to have someone
21:08being the quality gate or the gatekeeper for anything which is going to move into that code repository.
21:14So we simply reuse that. So whenever you are trying to come up with a new release, and the release for
21:23us means that you have changes inside of your data architecture, the system is going to follow up and
21:29trying to find out like, okay, like where is that data being integrated to? Where is it being shared?
21:35So who do we need to ask for approval if you may do that or not?
21:40That's like literally like what I have not seen in the market so far because this is really implying
21:46that you are asking upfront before you're doing a change. And if you're thinking a little bit ahead of
21:53you don't need to necessarily do that at the very end, what usually happens like after testing,
21:59you can actually start asking that particular question in that tool in the design phase.
22:05So you know actually what is coming up and you can inform the people who should actually react up on
22:13that, right? If you're changing your field in a system, that might actually affect an interface and that
22:18interface might affect another system and so on and so on.
22:21And it might affect the process of how somebody does his or her work.
22:27Correct. Correct. So, and that's like the beauty of it because we would, well, suggest to start with
22:36that instead of waiting for everything to be done and then starting testing and like, oh, well,
22:42it fails. Why? Yeah, because there is not, well, this field is now not supported anymore in our
22:48interface because there's a field length of 10 and we have now 12. I don't know, right? Just coming up
22:53with something. But that's essentially, now you can actually say, well, okay, I want to change it to 12.
23:00Who do I need to actually ask? And the system will figure out the path of that particular element
23:07throughout your whole data landscape and will create a task for everyone and asking the question,
23:12okay, well, are you feeling okay with that change or do you need to react upon that? And then we are
23:19basically orchestrating that communication between the different system owners and sort of like pushing
23:27out the change before it actually happened.
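As a rough illustration of "asking upfront": if field-to-field mappings are recorded, the path of a changed field through the landscape can be walked and an approval task raised for each affected system owner. The graph, owners, and function below are assumptions for illustration only, not Codoflow's API:

```python
from collections import deque

# Hypothetical mapping graph: source field -> downstream fields it feeds.
downstream = {
    "ERP.customer.name": ["CRM.contact.full_name"],
    "CRM.contact.full_name": ["Marketing.lead.display_name"],
}

# Hypothetical system ownership.
owners = {"ERP": "anna", "CRM": "ben", "Marketing": "carla"}

def approval_tasks(changed_field):
    """Walk the integration path of a changed field and collect who must approve."""
    tasks, queue, seen = [], deque([changed_field]), {changed_field}
    while queue:
        field = queue.popleft()
        for nxt in downstream.get(field, []):
            if nxt not in seen:
                seen.add(nxt)
                system = nxt.split(".")[0]
                tasks.append((owners.get(system, "unknown"), nxt))
                queue.append(nxt)
    return tasks

# Example: a field length change on ERP.customer.name (e.g. 10 -> 12 characters).
for owner, affected in approval_tasks("ERP.customer.name"):
    print(f"Ask {owner}: can {affected} accept the change?")
```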
23:30That sounds pretty interesting. We will be back with more stories about you and Codoflow after a short ad break.
23:45Hey guys, welcome back to my interview with Stefan Opitz of Codoflow. We will be now talking a little bit about
24:06real-time collaboration. In what way does Codoflow enable the real-time collaboration among the stakeholders in the company?
24:06That's actually a very, let's say, critical element of the whole solution. So again, coming back to our past,
24:15so we have been doing a lot of integration projects between different systems.
24:20I would say 10 out of 10 start with a sentence, let's open up an Excel spreadsheet and let's combine
24:28forces and mapping fields and objects.
24:32Well, in some cases, it might not be Excel, it might be a Google spreadsheet. It doesn't matter, right?
24:36But in the end, it's just a spreadsheet where everyone is trying to collaborate on.
24:43Main drawback, you don't have any data model information at that point.
24:47You just see, well, attributes. It doesn't tell you anything about cardinality or anything.
24:53The other point is, oh, I have seen it. Oh, let's filter it. Oh, let's re-sort it. Oh,
25:00unfortunately, I did not have all the different columns in my working area. And suddenly,
25:06the work is gone, right? So it has happened to me multiple times.
25:11So we said, okay, let's stop that. Let's actually put some sort of model behind it. So
25:20meaning that we created purposely our solution as a web solution, where everyone can collaborate in,
25:26like in a Miro board, for instance. So whatever you're doing, the other person sees at that very
25:32moment. And we have a split screen approach, meaning that you can have on one hand, your table with your
25:39attributes. On the other hand, you can actually see the data model or even the integration in itself.
25:46And you just like basically connect the dots as well, working, talking to the other person who is
25:52responsible for the other part. So it's basically, if you're part of the left side, you're connecting the
25:58middle part and the other part is connecting the middle part to their part. And this is like how
26:03we envisioned how we would like to work when we talk to someone, because usually we are in like the
26:10left part of that system, we are providing information. And we would like to simply say,
26:15okay, well, what do you need? This is what we have. See here, putting it in and saying, okay,
26:20let's now actually talk about like how this is actually being connected. So that's kind of like our
26:27approach. So we purposely didn't want to create another spreadsheet solution. We wanted to have to
26:35combine a system model, including the, let's say, Excel-like experience and background, plus putting
26:45some permissions on top of that, because we felt like that is something which is desperately needed in
26:52that regard. I have a question in terms of understanding. You do this once and the system then
27:00automatically updates itself? It does not update itself by definition, but remember I said we have
27:10the design first approach. So if we would actually go for the system to update it by itself, like using
27:17some APIs or reading SQL tables and background and all that, that would actually be really difficult
27:23to handle because we want to actually react before the change happens. So if we want to react upfront,
27:31then the approach is, well, we have the design first, meaning that you are planning what you want to do
27:36before it actually happened. So for that reason, you are starting at that point. And the second part is,
27:43you have a versioning model behind it. So if you are uploading something, well, to what version?
27:48To what release? Is it already approved? Do we merge? Who is deciding on that, right? So these are all the
27:55little elements which are, let's say, well, the consequence of having a proper change management behind
28:02that, but it allows you to really have an unknown quality of your data model, which has never been done
28:10before, because now you can actually trust the data, which is different from what we have seen so far.
28:17No one trusts documentation except their own, maybe, and maybe, really, maybe.
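A toy sketch of the pull-request analogy for data models: a proposed schema change travels through review states before it is merged into the trusted model. The states and fields here are assumptions, not Codoflow's data model:

```python
from dataclasses import dataclass, field
from enum import Enum

class ReleaseState(Enum):
    DRAFT = "draft"
    IN_REVIEW = "in_review"
    APPROVED = "approved"
    MERGED = "merged"

@dataclass
class ModelRelease:
    """A proposed change to the data model, versioned like a code pull request."""
    version: str
    changes: list = field(default_factory=list)
    state: ReleaseState = ReleaseState.DRAFT

    def submit_for_review(self):
        self.state = ReleaseState.IN_REVIEW

    def approve(self):
        if self.state is ReleaseState.IN_REVIEW:
            self.state = ReleaseState.APPROVED

    def merge(self):
        # Only approved releases become part of the trusted model.
        if self.state is ReleaseState.APPROVED:
            self.state = ReleaseState.MERGED

release = ModelRelease("1.4.0", changes=["customer.name length 10 -> 12"])
release.submit_for_review(); release.approve(); release.merge()
print(release.state)  # ReleaseState.MERGED
```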
28:24It does make sense. We've been talking about AI training. Data transparency is here a big topic.
28:33How do you enhance this data transparency and governance? Because that is really something you
28:39should first have nailed before you even start training in an AI or an AI data.
28:44That's excellent. Well, actually, this is a perfect, well, next step because we were just talking about it.
28:52If we are talking about data transparency, it means to have a versioning behind it. It means that you
28:57need to have a trustworthy data source for that as well. Right now, the majority of companies,
29:05at least I have seen, well, at best they have some sort of like, well, data dump called Confluence or
29:13alike. Well, no pun intended here. It's just that someone is writing that because it has been the contract
29:22that you are providing some sort of technical documentation. So you're just writing it down,
29:29basically, well, dumping it somewhere. And I would even say the majority of these documents is,
29:37well, read by yourself and potentially one or two other people. And now you have different systems,
29:44and you have different contributors, and everyone is writing it in a little bit different way.
29:50In the end, you're getting, well, some sort of documentation. No arguing about that. But
29:57if you want to create a big picture out of that, it's rather impossible. And that's why we are saying,
30:04okay, let's bring everything in one spot and have a proper process behind it, have proper responsibilities
30:12behind it. And then suddenly, you have a transparent data model and a data model which you actually can
30:19rely on, which you can trust on because you have put the proper processes in place to support that.
30:28But that only works if everyone is contributing. Right? Otherwise, you again will end up in a,
30:33in a, well, situation where you don't really trust the data anymore. And that's the part.
30:43Could you talk about training AI? Could you also use your tool to investigate if something is going
30:50completely off the charts with the AI if it's not inherent in the model itself?
30:55Well, we don't touch the data in, in itself. So we don't know what the data is because we purposely
31:01designed our tools.
31:02I was, I was referring that some AI models are known to go off the charts. They, they are unstable. And
31:10some of them are performing good to really good. Uh, but they still can be, they still can be issues.
31:18And if you could use your tool to investigate if one of the issues may be in the data that you've had to
31:24it. At least we can, well, if we have that data model, uh, inside of our landscape, basically,
31:33at least we have a very clear understanding on where information is coming from. Right? So we can
31:39definitely plot the path of all the elements which are going in there. So if it is being fed by, I don't
31:45know, let's say, um, you have customer data and you have order data and both of them are being, being
31:52combined, but the customer data is coming from an online shop and the order data is coming from an ERP
31:58system, then there might be an inconsistency between these two, right? Because, uh, there might be a mix
32:04up of, of data sets and, uh, or not, not the right data quality. I don't know. So at least you know
32:10where to look at, right? So we don't know what data is being exchanged, but we know the data types
32:16or like the data sets in that, in that sense. So I would say, yes, it is definitely a good starting
32:23point to, to, for your investigation. We at least help you to identify the areas where you should look
32:30first. We are always interested in looking a little bit also into your entrepreneur's journey.
32:37Um, I would be interested in what obstacles did you encounter during the development and the launch
32:43of Codoflow? Well, I would say, ask me what challenge I did not face. Uh, I think that would be
32:49a short answer. Um, I guess everyone has, um, has challenges, uh, here and there, and that's our, our daily
32:56life. But if I would really need to mark one or two challenges, which are outstanding, I would say
33:04the first one would be UI and UX design. Um, it is, as mentioned earlier, it is a complex problem.
33:15And we try to build a solution, which is so easy to use that the complex problem is actually more or
33:22less taken care of by the system itself. And you as a user just need to, well, focus on like providing
33:30the proper information and the system will guide you through the rest. Also, we are acknowledging that,
33:40let's say the majority of data architecture frameworks are relying on UML, unified modeling
33:48language, which is a, let's say, more software engineer approach, uh, of looking at, well, systems,
33:56data, whatsoever. And we purposely said, okay, well, we want to build something which is, well, readable for
34:03anyone, not just IT people or people with, uh, an education in that particular area, but actually
34:10even business users. So we really, really put a lot of thought into that. And, uh, it sounds easy,
34:17but it was really, let's say it, it was not just one iteration. It was multiple. And I would say that
34:24was our biggest, biggest achievement to finally overcome with that. Not claiming that we will not
34:31learn along our path and even like, um, well, readjust it again, because it is, that's how life
34:39works, right? You continuously learn. Well, it should work that way. If you stop learning, there's something
34:45wrong. Um, and when you're done with it, how was the market response to Codoflow, um, in the beginning
34:53and since it's launched? Oh, that's, uh, that's a good one. So let's say, well, or you mentioned earlier,
35:05data architecture is not the, not the most prominent topic in the world, right? So right now people would
35:11rather talk about AI. Nice way to say that. Yeah. Well, trying to be, to be nice here, right? So, and
35:19it's kind of like, if you are getting in front of customers, we need to explain the problem first.
35:28And then we need to explain that it is really becoming, um, a problem at some point if they don't
35:35do anything about it. And that is independent if they are using our system or not, it will just hit
35:41them. And there are studies showing that as well, right? So data architecture is becoming one of,
35:46one of the, the, let's say prerequisites, uh, to be successful in the near future.
35:54So I would say that that was like the, let's say more challenging part when we are getting in front of
36:01them, but funny enough, I have not had one single person not confirming that the problem,
36:11the symptoms of not managing your data architecture. And there are like many of them,
36:16like outdated documentation is just one of them. But just like trying to answer the question,
36:21where is the data set coming from? I have, let's say the majority of people cannot really answer that,
36:27at least not without putting effort into that. And that was the part where everyone was like,
36:34a little bit like, oh, okay, is there really a solution? And, and it's more like they cannot
36:41really believe it. And then we start actually explaining how our approach is. And then you can
36:46feel like, well, if it would be that easy, why hasn't someone done it yet? Right? So that's,
36:53that's, it's, it's really like a weird discussion and you really need to kind of like juggle
36:57like the, like the two sides and not making it too easy. And on the other hand, like trying to
37:02point out like, well, that it is really a problem. I see. I'm talking
37:11about the future here. We already discussed the importance of Codoflow on data architecture,
37:19data management in particular when training AI. What impact do you foresee will AI have? And what's
37:29your future plans for growth and evolution in the next few years?
37:33Yeah. Well, AI actually one, one part of our, of our roadmap is to expose, well, the model context
37:47protocol, which is basically allowing language models to use our system so that it becomes even,
37:54even easier to, to, to use. And you can actually ask questions about the data model. So instead of like
38:01asking an IT person, where's the data coming from, you can simply ask the language model and then it
38:05will tell you that's like one part of our, our roadmap, which is kind of like answering the, the AI
38:11question as well, right? Because AI will become more and more part of our world. Like what I personally
38:18tend, tend to say it, it will make good people be twice or three times as efficient as they are,
38:27if they are knowing the right questions to ask and if the data quality is right. And the, well,
38:33ensuring the data quality is right is exactly what we are targeting to do. On the other hand,
38:41well, another big milestone, which we are trying to achieve by the end of this year is what we would
38:48call a so-called deployment project. So I was just like pointing out that we will ask, or we will send
38:54out these approval requests when you are trying to approve a release for, for a data change or data set
39:01change. That might actually have implications, right? So if you're sending it out and you're asking 10
39:09people and suddenly that might actually appear, well, okay, I might not need to change my direct
39:13interface, but there's like three systems further down the integration stream. There's actually an,
39:20like an interface you need to change as well. And now you need to manage that further, or you need to
39:27actually make sure that your changes are going online in the production environment with that change as
39:35well, which essentially is an orchestration challenge. So we put now in place this, or we are planning to put in
39:44place this deployment project, which is going to allow you to keep track on what changes are implied by
39:54your change and who else need to change that and start the design journey on their end as well. And then
40:01making sure that everything is like flowing together and well, kind of like a program management behind
40:08that so that you are then making sure that all the necessary changes are going online in the production
40:14environment at the same time. And that is something I believe no one has ever done before, because this
40:22is really meaning that you need to have an understanding between different environments as well, which we do
40:29from a data model point of view and background as well. So we are differentiating it between a
40:34development system versus a production system. Majority of development systems are not even integrated
40:40anywhere because you are just developing there. So if you are changing something in there, you do not need
40:48to check that against the development environment. No, you need to check that against the production
40:52environment, because that's kind of like where you are trying to push it into. And then suddenly you
40:57have like the real big picture behind that and you can start planning, I don't know, ERP upgrades, for instance.
41:05It's usually like, well, let's say a challenge to say the least. And now you have a completely
41:13different approach on, well, mitigating that risk of failing in the integration tests, which is usually
41:23come at the very end, because that's usually the part which is to develop the last. Does that answer the question?
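To make the model-context-protocol idea tangible: a language model could be handed a small lineage tool that answers "where does this data come from?" directly from the modeled mappings. The plain-Python stand-in below uses hypothetical data and names and is not Codoflow's interface:

```python
# Hypothetical lineage table: target field -> the source field it is derived from.
lineage = {
    "Warehouse.report.revenue": "ERP.invoice.amount",
    "ERP.invoice.amount": "Shop.order.total",
}

def where_does_it_come_from(field):
    """Trace a field back to its original source, step by step.
    An MCP server could expose a function like this as a tool so a
    language model can answer lineage questions instead of an IT expert."""
    path = [field]
    while field in lineage:
        field = lineage[field]
        path.append(field)
    return " <- ".join(path)

print(where_does_it_come_from("Warehouse.report.revenue"))
# Warehouse.report.revenue <- ERP.invoice.amount <- Shop.order.total
```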
41:29Mm-hmm. Yes, it does. And it leads me to, so to say, the final question. What advice for other founders
41:42would you offer to them? What key advice before or during venturing into SaaS space?
41:52Well, obviously, don't do data architecture. Of course, we are there. No. Jokes aside, and I guess I
42:04said it before, if you're trying to really change the world, try to find a complex problem which you
42:12can solve in the most easy way you can possibly think of. It is the time for doing things easily. It's not
42:22the time of doing things complex. And I think that is like where a lot of people are having the, let's
42:30say, they have the right idea, but they are trying to overcomplicate it, over-engineer it. And I think
42:36right now is the time to actually make it as simple as possible so that you are capable of adopting it
42:44easily. Because a solution which is not adopted is not going to help anyone.
42:49Mm-hmm. Very smart words. Stefan, it was a pleasure talking to you. Thank you very much.
42:58Of course. For our audience, I would be interested, looking back, is there anything you would have done
43:06differently in your entrepreneurial journey? Stefan, thank you very much again. It was a pleasure having
43:11you as a guest and we will be back for our premium subscribers in the Founders World with a few
43:18more in-depth questions. Thanks, guys. That's all, folks. Find more news, streams, events,
43:31and interviews at www.startuprad.io. Remember, sharing is caring.
