If greater-than-human artificial general intelligence is invented without due caution, it is all but certain that the human species will be extinct in very short order.
I taught at a school in Cincinnati with a 0% graduation rate, and that was also interesting. I updated from thinking school was beneficial for other people but not for me, to thinking school was beneficial for maybe some people around the middle – at least at some of the better schools – but not for the vast majority of people, to then actually reading the literature on education, intelligence, academic accomplishment, and symbolic manipulation and concluding, "No, school isn't good for anyone." There might be a few schools that are good for people, like Blair and Stuyvesant, which may actually teach people, but school can better be seen as a vaccination program against knowledge than a process for instilling knowledge in people. And of course, when a vaccination program messes up, occasionally people get sick and die of the mumps or smallpox or whatever. And when school messes up, occasionally people get sick and educated, and they lose biological fitness. In either case, the people in charge revise the program and try to make sure that doesn't happen again; but in the case of school, they also use that as part of their positive branding, maintaining a not-very-plausible story about it being intended to cause that effect while also working hard to make sure it doesn't happen again.
I know it's tedious, but you aren't SL4 until you appreciate the depth of human irrationality. Yes, rationality is there, but at a level that just barely shows up when measured with precise instruments. This is important with respect to non-human intelligences, because super-rationality, almost as much as superintelligence, is potentially overwhelming, and because intelligence bootstrapping moves a system towards rationality. Appreciate how far humans are from rational and you appreciate how utterly transformed, and essentially recreated, they would be by haphazard bootstrapping. Appreciate how formidable rationality is and you see why a highly rational infrahuman GAI would still be a massive existential threat.