Silicon Battery Technology Breakthrough to Dramatically Increase Electric Vehicle Range

wow, it looks like an El Camino ... I'd gladly drive that one :cool:
Yeah, look at where the wheels are, though; you'd sit in front of the front wheels? There's no engine up front either, so that would be hellish in a front-end crash.
 
The F-35 HMD helmet is a f'n mess right now, especially considering each one is custom-made for each pilot's head. But the truth of its delay is mostly that, like most gov't contracts, they are Waterfall in nature. As a result there are a ton of unknowns, especially with parallel systems. This is the opposite of the modern developer world, where systems are largely self-contained, small, well-defined, manageable targets with interfaces and dependency injection. It's these unknowns that are discovered later that bite you in the tail.
The concurrent development issues were well known before those programs were even started; it was a risk the government was willing to take. There have been discussions about it in the professional program/project management community for years, and it was a hot topic when the DoD decided to undertake DDX, CVN, and the F-35. Concurrent development increases the parallel dependencies, thus its name is 100% self-explanatory, but it promises to deliver a more up-to-date system when acquisition takes many years. Blaming everything on waterfall is kinda ignorant. It's just a tool in the toolbox, like a hammer. It's the appropriate tool for many jobs, but not all. I've seen agile programs succeed and fail. I've seen waterfall succeed and fail. I've seen hybrids and all sorts of "modern developer methods" succeed and fail. The success or failure hinges on choosing the correct program management method/tool for the appropriate tasks. Too many people learn one method, or just have a preference for it, and everything starts looking like a nail.

The underlined portion is really Systems Engineering 101...something which most people who call themselves "SEs" really don't do, and far too much weight is given to program/project management vs. the technical analysis side of that job. Real, classic SEs are a rare breed and don't come cheap. Specifically, making sure the right thing is being designed and built, which requires doing that underlined bit really well. Real SE is program management method agnostic. It's a way of thinking about and analyzing a system, which is different from classic reductionist, domain-specific engineering methods, such that "systems are largely self-contained, small, well-defined, manageable targets with interfaces and dependency injection." Some program/project management methods have attempted to weave those SE ideas into the fabric of their process, but that's not a guarantee of success. Mature software engineering shops are the closest, but that largely has to do with the need to manage complexity (what SEs are primarily focused on, that is to say interactions between components) vs. managing design complications (domain engineering).

Just some thoughts. Gotta get some stuff done today.
 
The problem with silicon-based batteries is that the crystals crack (silicon has a tendency to do this as it is hard but brittle) as they charge and discharge. Once cracked they don't hold a charge as well.

They made a discovery years ago that combining silicon with carbon nanotubes solved this problem. But getting the carbon nanotubes to bind on a large scale was difficult.

But in theory, by decreasing the size of the crystal lattice and binding it to a flexible semiconductor compound like carbon nanotubes, you can bypass these issues.

Makes me wonder if they found a way to bind it to a graphene compound?
You'd think they'd mention it if the "secret material" was just graphene.

Although I think we've all heard enough hype-jobs about that to dismiss any breakthroughs based on it out of hand...
 
Get 30x 18650 batteries, hook them into sticks of 3 cells in series (~12 volts), and run 10 of those sticks in parallel. You will have between 18 amp-hours and 50 amp-hours; your current sealed lead-acid battery is only 5 amp-hours. With 5,000 mAh cells you're talking 15 hours vs. 15 minutes (real world, expect about 8 to 10). I'm putting together a solar-panel-based system, similar to a battery backup, and in early tests using a 12 volt supply to charge the batteries I have gotten my laptop to run off it for around 5 hours before the AC/DC inverter shuts off. This was with a 12 volt, 1 amp supply charging it for about 12 hours. They have some charge when shipped, 40% if I recall. So 12 amp-hours on top of around 25 amp-hours, giving me 37 amp-hours, or about 7 times a fully charged sealed lead-acid battery. On a regular unplugged backup the same laptop gets about 1 hour, so it is giving close enough to what it should in theory to be within a standard error rate. Outside factors, or my power supply used to charge it, may not be putting out a full amp.
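If anyone wants to sanity-check the arithmetic, here's a rough Python sketch of the series/parallel pack math. The 3.7 V nominal per cell and the load current are my own assumed numbers, not from the build above:

```python
# Rough sketch of the series/parallel pack math described above.
# Assumptions (mine, not from the post): 3.7 V nominal per 18650 cell,
# and a made-up constant load current for the runtime comparison.

CELL_VOLTAGE = 3.7        # volts, typical Li-ion nominal
CELLS_IN_SERIES = 3       # one "stick" -> ~11.1 V, roughly a 12 V system
STICKS_IN_PARALLEL = 10   # 30 cells total

def pack_specs(cell_capacity_ah: float) -> tuple[float, float]:
    """Return (pack voltage, pack capacity in Ah) for this layout."""
    voltage = CELLS_IN_SERIES * CELL_VOLTAGE          # series adds voltage
    capacity = STICKS_IN_PARALLEL * cell_capacity_ah  # parallel adds capacity
    return voltage, capacity

for cap_ah in (1.8, 5.0):  # cheap salvaged cells vs. good 5,000 mAh cells
    volts, amp_hours = pack_specs(cap_ah)
    print(f"{cap_ah} Ah cells -> {volts:.1f} V pack, {amp_hours:.0f} Ah total")

# Runtime vs. the 5 Ah sealed lead-acid pack, assuming the same load current.
LOAD_AMPS = 3.0  # hypothetical draw
print(f"lead-acid: {5 / LOAD_AMPS:.1f} h, 18650 pack: {50 / LOAD_AMPS:.1f} h")
```

Same math as above: 18 to 50 Ah depending on cell grade, against 5 Ah for the lead-acid pack.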

Buying 18650 cells in bulk to build the battery pack will cost you just under 1 dollar per cell, so it is very cheap. You could harvest them from old laptop batteries as well; however, from one battery to another you will have lower or higher mAh capacities, wear and tear on them, etc.

LMAO, thanks
 
That's the downside to capitalism: some of the best tech out there dies because no one wants to invest in the production due to the initial high costs.

Seriously?!? "Initial high cost" - such a breakthrough would be worth billions. Initial high costs would hardly be of any consequence if the tech was even remotely viable. Capitalism would be the ultimate driver to ensure such tech saw the light of day. Think of the long term profits!

Capitalism has pushed more advancement in the entirety of human history than any other socio-economic system. Income inequality? HA! Go back 300 years and go interview anyone in Europe.

Ugh. Instead of anti-capitalistic propaganda how about reading up on some actual history, eh?
 
The concurrent development issues were well known before those programs were even started; it was a risk the government was willing to take. There have been discussions about it in the professional program/project management community for years, and it was a hot topic when the DoD decided to undertake DDX, CVN, and the F-35. Concurrent development increases the parallel dependencies, thus its name is 100% self-explanatory, but it promises to deliver a more up-to-date system when acquisition takes many years. Blaming everything on waterfall is kinda ignorant. It's just a tool in the toolbox, like a hammer. It's the appropriate tool for many jobs, but not all. I've seen agile programs succeed and fail. I've seen waterfall succeed and fail. I've seen hybrids and all sorts of "modern developer methods" succeed and fail. The success or failure hinges on choosing the correct program management method/tool for the appropriate tasks. Too many people learn one method, or just have a preference for it, and everything starts looking like a nail.

The underlined portion is really Systems Engineering 101...something which most people who call themselves "SEs" really don't do, and far too much weight is given to program/project management vs. the technical analysis side of that job. Real, classic SEs are a rare breed and don't come cheap. Specifically, making sure the right thing is being designed and built, which requires doing that underlined bit really well. Real SE is program management method agnostic. It's a way of thinking about and analyzing a system, which is different from classic reductionist, domain-specific engineering methods, such that "systems are largely self-contained, small, well-defined, manageable targets with interfaces and dependency injection." Some program/project management methods have attempted to weave those SE ideas into the fabric of their process, but that's not a guarantee of success. Mature software engineering shops are the closest, but that largely has to do with the need to manage complexity (what SEs are primarily focused on, that is to say interactions between components) vs. managing design complications (domain engineering).

Just some thoughts. Gotta get some stuff done today.

Statistically speaking, waterfall on average has about a 60% rate of failing to deliver on time, and the average overshoot is something like 55%. It is by far the worst development methodology and should be used sparingly in large, concurrent, parallel development. I understand there are a lot of unknowns, but it is these unknowns that bite people in the tail. If you choose waterfall then you are knowingly creating a risk for delivery.


And your argument about Systems Engineers is a weak one. With domain systems so large, it is IMPOSSIBLE for any one man to lead with the technical knowledge necessary to solve every task and be the architect of it. It takes teams of people. Now I'm not negating the need for System Architects and Engineers. They are very much needed and NOT negated by reductionist engineering development models. But a System Architect is like a 5-star general: he has a team of generals whose abilities he knows well, and he divides out the work accordingly. System Architects do more research and delegation of authority. If a System Architect thinks he has all the answers, he should learn humility and learn from those he works with. A Systems Engineer and Architect, however, gets to make the final call. That is his responsibility. And if he fails to deliver, then that is on him and the product owners for failing to estimate proper deadlines.

In fact, I wrote an article on this very same thing on LinkedIn: knowing roles and responsibilities when forming large project teams. My next article is on waterfall pitfalls and balancing "ready for development" versus quick delivery.

BTW: Modern architectures should not care what your code pieces are (low coupling) if you properly implement contracts and use dependency injection. And I'm not just a software engineer. I'm a hardware engineer as well.
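To make the "contracts + dependency injection" point concrete, here's a minimal, made-up sketch (toy names, not from any real project): the consumer only knows about the contract, so you can swap implementations without it caring what the code pieces are.

```python
# Toy example of low coupling via a contract plus dependency injection.
from typing import Protocol

class Storage(Protocol):                 # the "contract"
    def save(self, key: str, value: str) -> None: ...

class DiskStorage:                       # one concrete implementation
    def save(self, key: str, value: str) -> None:
        with open(f"{key}.txt", "w") as f:
            f.write(value)

class InMemoryStorage:                   # another, e.g. for tests
    def __init__(self) -> None:
        self.data = {}
    def save(self, key: str, value: str) -> None:
        self.data[key] = value

class ReportService:
    def __init__(self, storage: Storage) -> None:
        self.storage = storage           # dependency is injected, not constructed here
    def publish(self, name: str, body: str) -> None:
        self.storage.save(name, body)

# Swap implementations freely; ReportService never changes.
ReportService(InMemoryStorage()).publish("q3-report", "draft")
```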
 
OH LOOK. it's another breakthrough that will never see the light of day.

That might have been true ten years ago, but we have Elon Musk, who might take Norway up on their breakthrough; then other automakers will follow. I hope Elon never dies.
 
No, not new technology! It burns the troglodytes' eyes!
Honestly, I wish I remembered the idiots who constantly shit on emerging battery tech because it wasn't where the previous technology was.
(EVERYONE EXPECTED IT TO BE BEHIND AT THE START FFS YOU ARE NOT SMART FOR REMINDING US OF NOTHING CONSEQUENTIAL) - Rant to all those ppl.

It's sensationalist journalism. It's trash, TBH. It's someone using an article to try to get people to fund this when it will likely go nowhere.

If it sounds too good to be true, it probably is. And there's always a catch. I do want them to figure it out, but there's no magic solution to some of these problems. It's hard, painful work usually.

I've heard of so much vaporware growing up that I'm cynical at this point. Give me a proof of concept and the slightest chance of large-scale manufacturing and THEN I'll get excited.
 
Seriously?!? "Initial high cost" - such a breakthrough would be worth billions. Initial high costs would hardly be of any consequence if the tech was even remotely viable. Capitalism would be the ultimate driver to ensure such tech saw the light of day. Think of the long term profits!

Capitalism has pushed more advancement in the entirety of human history than any other socio-economic system. Income inequality? HA! Go back 300 years and go interview anyone in Europe.

Ugh. Instead of anti-capitalistic propaganda how about reading up on some actual history, eh?

History is littered with superior technology losing out to cheaper technology. Gas-powered cars are a good example of this, but there are plenty of others.

At the end of the day, cost and profit matter more than advancing superior technology.
 
That made me laugh. But when it comes to advanced technology the best the Chinese can do is copy what's already there.

Even their space program and aircraft carriers are built on well-outdated tech. The stealth fighter knock-off they STOLE from us is vastly inferior.

They are industrious and decent at mid-tech knock-offs, but they are not a science-based engineering powerhouse.

Made me laugh when I was typing it. Probably would've made a good joke in a Naked Gun movie.
 
So all jokes aside, the real lesson here isn't just finding something new and superior, but also finding a way to match it to an existing fab and/or production process. I get this; now we just need to somehow instill it in the plethora of engineering students out there. C'mon folks, step it up.
 
As a few of you have pointed out, production is a major hurdle even if you have viable chemistry. This is what I do for a living, and the production process is a complete fucking nightmare to implement in volume.

The 800-mile electric car packs are technically here... we just can't make them in volume (yet...)
 
I'm going to rewind us a bit and be very blunt since you seem to be having trouble with reading comprehension and have gone way off topic.

The original article was about a battery breakthrough which appears to be TRL 2/3ish. There were multiple comments inquiring about why we never see these breakthroughs in products. I made my OP to provide some insight into the life-cycle of technology development, TRLs, and the multitude of challenges associated with getting across the valley of death. I think that was relevant, and I kept it at a high level because this isn't the correct place or audience for a TRL-framework throwdown.

I also provided some examples that I think a few of the people on this forum might find illustrative of the assertion I made concerning the difficulty of getting from TRL-5 to TRL-6: specifically, all of the unknown system-level requirements and challenges associated with using technology in the field as an integrated system, and the dangers of attempting to integrate it too early. I provided DDX, CVN, and the F-35/JSF as examples. I figured that most readers would be familiar with DDX's original railgun, the latest CVN's electromagnetic launcher (in place of steam), and the multiple technologies on JSF (to include the heads-up display you brought up). They were meant only as an illustration of the potential dangers of trying to integrate tech too soon, as it relates to the need to develop technology in stages, not a treatise on all of the problems with those developments, contractor or government. This simply isn't the place or time for it (it's off topic) and it's already been done ad nauseam.

You came back with your original reply about the F-35 HUD, which, while relevant to the failures of that project, is not relevant to the topic here. I indulged a bit.

Statistically speaking, waterfall on average has about a 60% rate of failing to deliver on time, and the average overshoot is something like 55%. It is by far the worst development methodology and should be used sparingly in large, concurrent, parallel development. I understand there are a lot of unknowns, but it is these unknowns that bite people in the tail. If you choose waterfall then you are knowingly creating a risk for delivery.

I made no assertion about any of this. I merely stated it's a tool and needs to be used wisely. I never said it was the best tool or that it's not misused extensively. The fact that you can't see that every method has risk (it's merely a matter of which risks and to what extent) is a serious maturity problem. The whole point is to choose a method that compensates for the risks your development will naturally incur. That takes an extensive amount of wisdom and flexibility to avoid treating everything like a nail. Keeping this relevant to this forum: I bet if we analyzed the whole-system PC replacement methods of the users here, they would be decidedly "waterfall"-like. Given that sites like this help to significantly reduce the unknowns, combined with the extensive standardization and modularity of the platform, it's not a bad choice.


And your argument about Systems Engineers is a weak one. With domain systems so large, it is IMPOSSIBLE for any one man to lead with the technical knowledge necessary to solve every task and be the architect of it. It takes teams of people. Now I'm not negating the need for System Architects and Engineers. They are very much needed and NOT negated by reductionist engineering development models. But a System Architect is like a 5-star general: he has a team of generals whose abilities he knows well, and he divides out the work accordingly. System Architects do more research and delegation of authority. If a System Architect thinks he has all the answers, he should learn humility and learn from those he works with. A Systems Engineer and Architect, however, gets to make the final call. That is his responsibility. And if he fails to deliver, then that is on him and the product owners for failing to estimate proper deadlines.

What are you talking about? I used the plural of SE, SEs, extensively in my previous post. You could read that as there being multiple SEs in the world or multiple on a program/project. Given the projects we're discussing, come on, no one is so dense as to think that an entire discipline on projects of this scale is handled by one person! Is there only one EE or ME too?

Since you entirely missed the point, I'll make it simpler. You were on about "systems are largely self-contained, small, well-defined, manageable targets with interfaces and dependency injection." I merely pointed out that those functions, roles, and responsibilities are the domain of Systems Engineering, though hardly a complete description of it. It's such a basic assertion that I mentioned it might as well be SE 101. Are you arguing with the rough definition of an SE's role? You've gone off the deep end arguing a straw man about needing teams...

System Architect is to SE as Power Systems is to EE. It's a domain specialization within the larger discipline. Just like the team rant, I'm not sure what that has to do with the thread's topic.

In fact, I wrote an article on this very same thing on LinkedIn: knowing roles and responsibilities when forming large project teams. My next article is on waterfall pitfalls and balancing "ready for development" versus quick delivery.

Congrats. I'm sure whatever it is entirely upstages everything written in the journals on the topic. Sorry for the sarcasm, but you're really coming off as a one-trick pony; no depth or understanding of conceptual frameworks. I'm not saying you need to have published in a journal to have an informed opinion, but you're coming off as far too rigid to really have a solid grasp of the concepts...you have a hammer and everything is looking like a nail.

Perhaps you do have a good grasp and are merely having trouble communicating effectively?

BTW: Modern architectures should not care what your code pieces are (low coupling) if you properly implement contracts and use dependency injection. And I'm not just a software engineer. I'm a hardware engineer as well.
I'm not sure what you're on about again. If it's SE, it's supposed to be implementation-agnostic, and yes, the idea is to manage and mitigate the risks associated with dependencies, interfaces, and relationships (among other things). If you got confused about my software comment, that's likely because you don't understand the history or fundamentals of the various model-based approaches used for the space race (e.g. the Apollo flight code, which I must add was way ahead of its time) and RDS-1000. Or the documentation-centric quagmire. Or the resurgence of MBSE, etc. I'm sorry if you misunderstood that as making any comment about your domain expertise, or lack thereof, in software or hardware. None of it was about you or me in any way.

Edit: this has degraded into irrelevance and madness...I've got better things to do at this point.
 
I wanted to toss out a quick apology to the rest of the forum for getting this off topic. It was not my intent. I have no interest in continuing that specific discussion as it's not relevant to the thread. Sorry.

In the hope that there may still be some edification, I offer the following analogy relevant to the forum.

A rigid adherence to a rule of thumb (which is culturally relevant, but inappropriate if you know the origins) is being a fanboy and usually denotes a lack of understanding. It betrays being limited to following rules and guidelines vice real comprehension of the subject, its nuances, and its edge cases. It's the equivalent of saying:

If you buy an AMD GPU over NVIDIA you're a moron. Or...
If you buy an Intel CPU over AMD you're a moron...
Or any combination there of.

The mature answer is: it depends...what do you want to do and why? When someone asks you to help them build a good PC, that's the mature, expert advice. You need to really understand their use cases, problems, and constraints. There are valid reasons to purchase any of those three main brands, but it really depends on why. Being too rigid and saying you should always buy X makes you a fanboy of limited knowledge and experience.
 