
What Makes Soluna’s Data Centers Efficient? A Recap of the Water Tower Research Fireside Chat Series

Keywords: Data Centers, Investors, Renewable Computing

In February 2023, Water Tower Research analyst Graham Mattison interviewed Soluna CTO Dip Patel on the engineering and design of Soluna’s Modular Data Centers (MDCs).

Together, they discussed how Soluna’s MDCs can help support the growth of renewable energy on the electric grid, as well as the opportunities and challenges of technology procurement in today’s marketplace.

The following is a transcript of their conversation, edited for clarity and brevity.


Graham Mattison: Dip, thank you for joining us. I’ll start off with the first question.

What brought you to Soluna? Coming out of your last venture, you must have had a number of opportunities. What was it that attracted you to Soluna?

Dip Patel: It was part luck and part intersection of interests. I’ve been into Bitcoin since it launched and its mission became clear. I thought it was very interesting. But I would say that the thing that attracted me most was climate impact. I left Lockheed to start Ecovent to focus on climate, and this was a natural segue from that. But what ties it together is Soluna Computing CEO, John Belizaire.

When I was building Ecovent, John was a mentor of mine. When I sold it, he helped me through that process, and when I knew he was starting a new company, he wrote me an email with the subject line, “Who do you know that knows about crypto?” I said to myself, whatever he’s into, I’m getting involved. That was the end of it. He wanted an introduction to a CTO and I said no. I won’t be introducing you to anybody until I get a chance to understand what you want. Then the rest is history.

How have the industry and this opportunity been different from what you expected when you first joined?

Dip Patel: When the Internet started becoming more mainstream in the late 1980s and early 1990s, I was a kid. I got to see it as a spectator. I saw the birth of a new industry, these little communities, and incredible niche ways of the Internet being used. It started with bulletin boards and Usenet, ended up with the web, and now streaming. It’s now the backbone of the world.

Digital currency and crypto are the same way. It started with all these tiny, beautiful little communities that were working on neat things, driven by passion, but then industrialization happened. These things started getting bigger. What happened, just like with the Internet, is larger companies and larger institutions started coming in, and that brought more discipline, more rigorous application of capital and growth, standardization, and some credibility to the whole game.

Now the issue with that is the Internet was driven by technologists. There were a bunch of nerds who started in the telecom industry, and most of the initial big web companies were run by technical people and technical operators. But what I’ve noticed in this space is that a lot of the industrialization is being led by capital first. The people that manage that capital usually have backgrounds in capital, consulting, or business, and not as many have backgrounds in pure operations or pure technology.

What you’ve got is the business sense coming in very disciplined, which is driving amazing analytics, metrics, and standardization. But it also means the people who operate are going to need to be very good. That really makes me happy, because one of the things that John and I pride ourselves on is that we’re operators. We want to get our hands dirty and get involved. I think that’s what’s happening: what you’re seeing during this downturn is that the operators are the ones emerging. That’s going to be fantastic for the industry as a whole.

Graham Mattison: You probably also didn’t expect how long it takes to get anything done in the electric power world either.

Dip Patel: Everybody warned us. 

Graham Mattison: But you can’t imagine until you go through it yourself. You’re like, I can’t believe it’s taking so long. But this is an industry where anything other than 100% uptime is a complete failure, so they’re very cautious.

Dip Patel: Totally. When it comes to these kinds of loads, the grid doesn’t know how to evaluate them. We thought moving from Morocco to the United States (Soluna started with a Moroccan wind project) was going to go faster, and it did. It’s much faster because we’re in the country and we know how to do business here. But you’re right, there are just so many complications and moving parts to interacting with the grid at this scale, and you have to really understand it. When it’s a new business model and a new way of using energy, people are ultra-cautious, especially with what’s happening with the climate and the stressors the grid is feeling right now.

Graham Mattison: And to add to that, in Texas, you’re coming on the heels of that major ice storm and power outages in 2021.

Dip Patel: That’s right.

Can you talk about Soluna’s modular data centers in terms of how you designed them and why you designed them that way? Then maybe talk about how they compare to other data centers that are out there.

Dip Patel: As you brought up earlier, my background is in building radar and wireless systems. In those systems, efficiency is everything. The more efficient you are, the farther you can see, the longer you can see, and so on.

When we came to this industry, we weren’t looking to build a data center. We were looking to solve a problem. That problem is that the more green energy you integrate into the grid, the more asymmetrical power problems you create. When the wind is blowing, people are sleeping. Unless the Eagles keep winning Super Bowls and getting there, evening energy use won’t catch up with what the wind produces.

The concept of the data center was: can we use computers to consume that unused energy? If we were going to do that, Bitcoin was an obvious first choice, but it was never just Bitcoin. It’s moving into Artificial Intelligence (AI) and Machine Learning (ML). We looked at AI and ML machines as a whole. What we learned was that there’s a whole bunch of computing that you can turn on and off, and that’s okay. If we take that and marry it to a data center, all of a sudden we can solve a lot of problems.

What makes the system so efficient is where we started. Because these things are remote, they have to be easy to construct, easy to operate, and very efficient, because you’re basically trying not to consume water while you’re remote. You need to be ultra-efficient; otherwise, there’s too much heat in there.

When you do all of that, you design, basically with a blank slate, what looks like a radar system, but it’s a data center. A big piece of that is you’ll notice computer chips are starting to operate at hotter and hotter temperatures. Even Equinix, one of the biggest data center companies, which prides itself on latency, is starting to run its data centers hotter. That’s a huge seismic shift in thinking.

When you take all those things and you mash them all together—the need for asynchronous computing, the need to consume random power in the middle of nowhere, the need to do this efficiently, and the fact that chips are hardening to less stringent requirements (all requirements that we dealt with in the military as we started deploying in the desert)—what you do is create a data center from scratch that can do that. That’s what we designed. We threw away all the initial design assumptions that data centers have, like needing multiple power sources, or needing 70-degree air going into your computer. We said, what if we didn’t do any of that and just kept the requirements that you actually need?

When we did that, we designed what you see here. The beauty of it is, when you design a radar, a lot of it is done on the computer. It used to be pen and paper; now it’s simulation software. It’s the same model here. The simulation software that we used in radar 20 years ago is now accessible to people like us at Soluna. You can do all these crazy simulations.

We had real-world data centers that were operating. We used that real-world data to feed our simulators, and we built a simulator for the data centers the same way we did for radar. That’s how we were able to fine-tune the entire design before we ever built one. Then we built one and fine-tuned it even more.
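
To make that simulation-first workflow concrete, here is a minimal sketch of the idea: characterize a module from measured data, then ask design questions of the model before building anything. The one-parameter thermal model and all the numbers below are illustrative assumptions, not Soluna’s actual simulators.

```python
# A toy version of the simulation-first workflow: fit a simple thermal
# model to measured data, then use it to answer a design question.
# All numbers and the one-parameter model are illustrative assumptions.

# Measured samples from an operating site: (power_kw, inlet_c, exhaust_c)
samples = [
    (800, 25.0, 33.1),
    (950, 26.5, 36.2),
    (700, 24.0, 31.0),
    (1000, 27.0, 37.3),
]

# Model: exhaust = inlet + k * power, where k lumps airflow and heat removal.
# Closed-form least-squares fit of k from the measured temperature rise.
num = sum(p * (t_out - t_in) for p, t_in, t_out in samples)
den = sum(p * p for p, _, _ in samples)
k = num / den  # degrees C per kW

def predict_exhaust(power_kw: float, inlet_c: float) -> float:
    """Predict exhaust temperature for a given load and inlet temperature."""
    return inlet_c + k * power_kw

# Design question answered before building anything:
# how hot does the exhaust run at full load on a hot day?
print(f"fitted k = {k:.4f} C/kW")
print(f"exhaust at 1.2 MW with 35 C inlet: {predict_exhaust(1200, 35.0):.1f} C")
```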

What were some of the challenges that you ran into in terms of making the changes from when you first started to where you are now?

Dip Patel: One is that what you want to do when you’re doing something new is surround yourself with the best experts. The best team. The best people who can help you avoid unoriginal mistakes. You want to make original mistakes.

The problem with this is every time we met new people and tried to explain this data center we were trying to build—we knew a lot of them from my background, through my MBA program, my brother being in the data center space, and John obviously being a telecom and software maven with a lot of contacts—everybody we talked to was basically like, “What are you doing? Come again?” It took so long just to find people who were open-minded enough to recognize that something like this could actually work. Those are the people we created partnerships with.

I would say that finding people who had both that amazing base of knowledge, which is impossible to simulate or learn quickly, and the open-mindedness to change the way they’re thinking isn’t hard; it just takes a longer period of time to find them.

Luckily, I got to deal with that at Lockheed, because we were shifting away from missile defense, where you hit a missile with another missile, to electronic warfare, where you’re confusing the other missile. When you’re trying to convince a military that’s used to hitting things to, hey, just confuse it, it requires a change of thought. John did this in insurance. Entrepreneurs do this all the time. It’s not that it was impossible to deal with; it was everywhere, because the energy people didn’t want to see it and the data center people didn’t want to see it. Then you layer crypto on top of that and it adds a whole other layer of Venn diagramming.

Graham Mattison: Looking at flagship Project Dorothy, this would probably be your fourth setup. The company is getting closer to bringing Dorothy online, and there was some great news out earlier this week that you’re in the final stages with ERCOT.

How have you been approaching the technology for Project Dorothy?

Dip Patel: The F-35 program in the military gets a lot of flak because it’s very expensive and it’s been going on for a while. But the thing is, it was revolutionary in that they did what’s called concurrent testing. They were building the flight manual for the plane while they were figuring out how to produce it. That shaved decades off the time to market, so to speak, for that aircraft.

We had to do the same thing at Dorothy. We have this data center that we’re building that has to interact with the grid, interact with the wind farm, and interact with pricing signals and all kinds of things, and then turn on and off based on a set of requirements. There’s really no way to test that until you build it.

What we did was build a digital twin. We literally built an entire simulator of Dorothy: the switches, the power supplies, the power distribution units, the transformers, and the fans. All those things were characterized by real-world data captured in Kentucky. Then we ran the control algorithms that we’re going to use to turn Dorothy on and off against that simulator. Once we knew that worked, we started controlling the Kentucky site using the Texas data to prove that our algorithms all made sense. In Kentucky, the utility gave us about three months after we turned on the site to make sure all our algorithms were working before we were tied to that power purchase agreement.

In Texas, as soon as we turn it on, we have to adhere to the rules of the grid, so that code had better work. First of all, the code is designed to fail safe: if anything is out of whack, the site shuts down. That’s always good. But the beauty of it is we’ve been able to de-risk it component by component, completely in the cloud. We’ve created digital twin technology that we can use at other sites in the future as well.

What that allows us to do is concurrent testing. We can test that all our switches, our networks, the control algorithms, the APIs, everything is working on time and exactly right. We’re using real data. The data is available on what the grid is doing. We pull that and put it into our twin. We make sure the twin does what it’s supposed to do. Then, when it comes time to bring the site online, you just plug in real miners and make sure everything is working there.
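
As an illustration of the pattern Dip describes—one control loop that can be pointed at either the digital twin or the real site, and that fails safe—here is a minimal Python sketch. The class names, signals, and thresholds are hypothetical, not Soluna’s actual software.

```python
# Minimal sketch: a fail-safe control loop replayed against a digital twin.
# All names and thresholds here are hypothetical assumptions.
from dataclasses import dataclass

@dataclass
class GridSignal:
    price_usd_mwh: float   # real-time energy price
    curtailment: bool      # grid or wind farm asking us to shed load
    frequency_hz: float    # grid frequency, a basic health signal

class SimulatedSite:
    """Digital twin: exposes the same interface the real site would."""
    def __init__(self):
        self.online = False
    def ramp_up(self):
        self.online = True
    def shut_down(self):
        self.online = False

def control_step(site, signal, breakeven_usd_mwh=40.0):
    """One pass of the control loop. Anything out of whack shuts the site down."""
    try:
        if signal.curtailment or not (59.9 <= signal.frequency_hz <= 60.1):
            site.shut_down()   # grid stress: get off immediately
        elif signal.price_usd_mwh <= breakeven_usd_mwh:
            site.ramp_up()     # cheap or excess energy: consume it
        else:
            site.shut_down()   # expensive energy: give it back to the grid
    except Exception:
        site.shut_down()       # unknown failure: default to off
        raise

# Replay historical grid data against the twin to de-risk the logic
# before the real site ever has to obey the grid's rules.
twin = SimulatedSite()
history = [GridSignal(12.0, False, 60.0),   # cheap power, healthy grid -> run
           GridSignal(95.0, False, 60.0),   # expensive power           -> off
           GridSignal(15.0, True, 60.0)]    # curtailment request       -> off
for sig in history:
    control_step(twin, sig)
    print(f"price={sig.price_usd_mwh:6.1f}  curtail={sig.curtailment}  online={twin.online}")
```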

Was it helpful having that data when you were dealing with ERCOT, in terms of getting them comfortable with your technology?

Dip Patel: We run a demo where these partners come and take the site down themselves. They actually see the site running, hear it, and then they hit a button and can see and hear it shut down. Then they hit another button and see it turn on. That really adds a lot of credibility and trust, because these folks that operate the grid down there are responsible for huge amounts of area and land with customers that need power. Their livelihoods depend on it.

We want to respect that, and the only way to do that is to make sure our tech works. If there’s a blind spot, we want to find it. That’s why this digital twin technology, having a real data center in Kentucky, and having partners in Texas, the ERCOT grid, and all our grid partners allow us to build this tech in a very stable way.

Graham Mattison: Shifting gears a little bit.

As you look at the technology that you’re putting into the data center, do you approach it differently depending on whether you’re hosting third-party miners versus generating Bitcoin or doing the computing for yourself?

Dip Patel: In many ways, no. But in many ways, yes. It’s almost like if you have an apartment. This analogy just came to me, so I hope it’s cool. Let’s say you have a two-floor multi-family apartment and you want to rent the bottom unit on Airbnb. There are certain protections you have to put in place, both for the person renting it and for you, and that’s how you approach it. It’s almost like we are hosting for ourselves.

We have a set of requirements as hosting clients, as mining operators, and then we have a set of requirements as the technical people running the data center itself. They have to match. When we bring in a third party on that, what we need to do is add in that third set of requirements as well as the applicable firewalls.

If these customers want to come in and look at their miners alone, but they’re physically connected to the same network, you need to have all the appropriate controls and firewalls in place, plus auditing and customer support. If our ops team doesn’t like something, they just hop onto Slack and say so. If a customer doesn’t like something, they must have a mechanism to say that. All those things are taken into account, and we’ve architected our software in a way that makes those types of things not easy, but straightforward, to implement. You still have to be thoughtful.

Graham Mattison: Just a reminder to the audience that if anybody has any questions, those can be asked via the webcast.

Bitcoin has seen a great start to the year, but last year was certainly a challenge, and there were a number of bankruptcies in the industry. How has the marketplace shifted or evolved for technology out there? Have you changed your approach to how you look at it?

Dip Patel: For technology, the markets have shifted, but if you look at the hashrate, the number of miners out there, it hasn’t 100% correlated. When the markets are down, it’s not like everybody unplugs. What that means is that more and more thought has been put into efficiency. There are a lot of amazing new software and analytical tools being released, like firmware for miners.

The technology itself, I would say, evolves differently. When it’s a bull market, people are looking at how to get more hashrate out of their machines. How do they cool them faster? How do they supercharge everything? When the market’s down, you might tune for efficiency rather than pure horsepower out of each machine: joules per terahash. How much energy does it take for a given unit of computing? You want to optimize for that ratio, and firmware might allow you to tune those things.
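
For readers who want the arithmetic: a watt is a joule per second, so dividing a machine’s power draw in watts by its hashrate in terahashes per second gives joules per terahash directly. A quick sketch with illustrative numbers (not specs for any real machine):

```python
# Joules per terahash: the efficiency ratio described above.
# A watt is a joule per second, so W / (TH/s) = J/TH.
# All numbers are illustrative, not specs for any real machine.
def joules_per_terahash(power_watts: float, hashrate_th_s: float) -> float:
    return power_watts / hashrate_th_s

stock = joules_per_terahash(3250, 140)  # hypothetical stock settings
tuned = joules_per_terahash(2600, 120)  # hypothetical efficiency tune
print(f"stock: {stock:.1f} J/TH, tuned: {tuned:.1f} J/TH")
# In a down market you may accept less hashrate (revenue) for a better
# J/TH ratio, because energy is the dominant operating cost.
```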

I’m not studying the markets. There are a lot of smarter people at Soluna that do that and explain it to me. I look more at the technology, the mining stack, and the efficiency stack of converting a watt to something useful. For us, what that’s done is given us more options on the software side.

The other piece, as I mentioned, is Equinix raising the temperature in its data centers to save and curtail power and become more efficient. That’s being driven by chip makers allowing their chips to operate at hotter core temperatures. This feeds into the efficiency conversation. It drives how we adapt chips to the world today. About 10 to 15 years ago, everything was about battery life. Battery life is getting there. It’s getting good. Displays and refresh rates are starting to become a big piece of that.

But when you look at ways you can drastically change efficiency in a model, maybe you run the machines a little hotter. All of a sudden, you can have these machines consuming less power for cooling. The fans around them can slow down. The heat sinks can change. There are a lot of trickle-down effects to that. I feel like the chips are starting to do that as well, driven by mobile applications too. Nobody wants fans in their pockets. It’s a killer. Removing heat and becoming efficient is driven not just by computing, but by the markets as a whole.

Can you talk about the technical challenges you faced when working with the wind developers, the utility, or the grid? I think you hit on this a little bit, but could you expand on how your data centers work with them?

Dip Patel: What’s awesome is almost all these folks have engineering departments that we always want access to, because we want to learn from each other. They want to know how it works. They want to understand it, because there are things like reactive power. For those who don’t know, there’s real power and what they call imaginary power. All kinds of things in electronics, especially AC circuits, create what’s called reactive power. That’s power that can change the overall efficiency of the grid, create heat, create mismatches and bounces, and things like that. The grid is very sensitive to understanding how these loads add to it.

Now, wind farms add reactive power because they’re massive energy generators. There’s a lot of scrutiny of the kind of reactive power our data centers create, because we consume a lot of power. If we consumed all that power running turbines, that would create a lot of reactive power. We run a lot of fans, but they’re all behind protections. We’ve architected our site to protect against reactive power. That’s just an example of something that a grid operator or an integrator is keen to understand, and it’s not really something you can understand unless you get a nuanced view of what’s happening inside the data center.
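
For those who want the underlying math: real power P, reactive power Q, and apparent power S form a "power triangle" where S² = P² + Q², and the power factor is P/S. A small sketch with illustrative numbers, not measurements from any Soluna site:

```python
# The power triangle behind the reactive power discussion above.
# Numbers are illustrative only.
import math

def power_triangle(real_kw: float, power_factor: float):
    """Given real power P (kW) and power factor cos(phi),
    return apparent power S (kVA) and reactive power Q (kVAR)."""
    apparent_kva = real_kw / power_factor                     # S = P / pf
    reactive_kvar = math.sqrt(apparent_kva**2 - real_kw**2)   # Q = sqrt(S^2 - P^2)
    return apparent_kva, reactive_kvar

# A hypothetical 25 MW data center load at two power factors:
for pf in (0.99, 0.85):
    s, q = power_triangle(25_000, pf)
    print(f"pf={pf}: apparent={s:,.0f} kVA, reactive={q:,.0f} kVAR")
# The closer the power factor is to 1, the less reactive power the load
# pushes onto the grid, which is why grid operators scrutinize it.
```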

Those types of challenges mean that we have to get good at communicating how the tech works and build trust with our partners, so that they trust it and aren’t going to freak out and create problems. These massive data center loads could really damage a grid if they’re not controlled correctly, especially when they’re turning on and off. That combination, the fact that we’re turning on and off and the fact that we are so big, is why they enjoy these demos so much: because they understand them.

Graham Mattison: The flip side of that is the ability to turn off means that when there’s excess demand on the grid and they need power, you guys can ramp down quickly so that other consumers can keep their lights on.

Dip Patel: Exactly. That’s a big piece of the value. It’s what we want to do. It’s exactly why we exist. There are a hundred other use cases for power that come before these data centers, and that’s the beauty of the use case of generative AI or any of these new use cases that don’t need 100% uptime or low latency. They just need a lot of computing. You can now provide that computing in a way that catalyzes green energy.

Do you need different computer technology or miners as you transition from mining Bitcoin to doing AI or other types of things?

Dip Patel: It is totally different. The miners that we use are Bitcoin miners. They use single-purpose chips called ASICs. These chips are designed to do one thing, and that’s run the algorithm that mines Bitcoin. They’d be horrible at doing anything else. If we want to go into AI/ML, what you want are computers optimized for those use cases: usually video cards (GPUs), CPU cores, RAM, and storage, balanced based on the different use cases out there.

Even more specialized chips are going to come out that do specialized kinds of AI and ML, especially with generative AI starting to take off. There are going to be custom chips. The beauty of it is that our data center is architected to accept any kind of computer.

If we pull out the Bitcoin miners and put in a bunch of AI/ML rigs, GPU rigs like the DGX systems by Nvidia, the network, the cooling, the power architecture, all of it can quickly adapt, and the software behind the scenes will recognize it and change how it operates. That’s the magic of this data center. It’s designed from the ground up to take advantage of the fact that chips are going to start changing fast and furious, and you need a home for them all that can accept them. Just like an Airbnb that’s good for a party, a family, or just a company.

Graham Mattison: We’re coming up to the end of our time, so I’ve got one final question.

From a technology standpoint, what’s the biggest challenge that you face as an industry and also as a company going forward?

Dip Patel: Technology-wise, I would say the challenge ties to what I mentioned earlier, which is that we are breaking a lot of cardinal rules in the data center design industry and in the power industry. When we do that, we have to make sure the technologies we apply are designed for it. When we apply firmware to a piece of mining equipment or to a network switch, we have to make sure that piece of software is robust to being turned on and off. That in and of itself is important, because for the software and hardware we use but don’t design in-house, we have to make sure it’s okay for this application, since we’re one of the first to do it. There’s not a lot of published information. That’s why we like to publish information as we get it.

The second piece is proving to folks that you can build a data center that doesn’t consume water, that can thrive in a highly remote environment, and that works as advertised. Just getting that social proof matters, and Dorothy is going to go a long way. The Kentucky project goes a long way toward it too, because we’re turning that site on and off every day.

The Dorothy site is going to take it a step further, because now we’re interacting with the ERCOT grid, which is another step. We’re co-located with a wind farm that has a curtailment problem. It’s legitimately square in the mission of what we wanted to solve: wasted energy being consumed by a purpose-built data center that can accept any kind of computing.

Graham Mattison: That’s always been the challenge in the electricity industry. Nobody wants the plants in their backyard, so you have to build the plants far away and bring the power to where the demand is. Soluna is actually bringing the demand to the site of generation. 

Dip Patel: Exactly.

Graham Mattison: Dip, thank you very much. I really appreciate your time. Thank you everyone for joining us.


About Dipul Patel
Chief Technology Officer, Soluna Computing

Dip Patel has served as the Chief Technology Officer of Soluna since 2018, having previously founded and sold Ecovent Systems, a climate control systems company. Earlier in his career, he worked at Lockheed Martin focusing on Ballistic Missile Defense programs and multiple advanced radar and electronic warfare systems. He received a B.S. from Drexel University, a master’s in Electrical Engineering from the University of Pennsylvania, and an MBA from the Massachusetts Institute of Technology, where he serves as an Entrepreneur in Residence and Lecturer.

About Graham Mattison
Senior Research Analyst
ClimateTech & Sustainable Investing

Graham Mattison brings more than 20 years of experience in equity research, investor relations, and corporate operations, growth, and development. Graham was the Investor Relations Officer for two NASDAQ-listed companies, where he led multiple equity raises and managed an activist investor campaign, M&A and corporate restructuring, and a NASDAQ delisting and relisting.

Previously, he was a Senior Equity Research Analyst, most recently at Lazard Capital Markets, covering the industrial and cleantech industries. He began his career in Southeast Asia as an Investment Analyst for Daiwa Securities. He was also co-founder of an online residential real estate start-up that developed a web-based auction platform. 

Graham received his BA in East Asian Studies with minors in Economics and History from Hobart College and his MBA in Finance with honors from the Thunderbird International Business School at Arizona State University. He is an Investor Relations Charter (IRC) holder from the National Investor Relations Institute.