How likely is an intelligence explosion?

If technological progress continues, an intelligence explosion seems likely. At some point, AGI (or even narrower AI with advanced research ability) will become better at building AI than its human developers, and could be used to improve its own design. This would create a feedback cycle in which increasingly intelligent systems improve themselves ever more effectively. It is unclear what form such an intelligence explosion would take: how fast intelligence would increase, and how high the ceiling is. Computers have many large advantages over biological cognition, so this scaling up could be very rapid, especially if there is a computational overhang, i.e., more computing power already available than current algorithms are able to exploit.
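The difference a feedback cycle makes can be illustrated with a toy model (purely illustrative, with arbitrary parameters, not a forecast): capability grows by a fixed human contribution per step, plus a term proportional to current capability once the system contributes to its own development. The first regime grows linearly; the second compounds and quickly hits whatever ceiling exists.

```python
# Toy model (illustrative only): capability growth with and without
# AI contributing to its own development. All numbers are arbitrary
# assumptions chosen to show the shape of the dynamic, not estimates.

def simulate(steps=50, human_rate=1.0, feedback=0.0, ceiling=1e6):
    """Each step, capability grows by a fixed human contribution
    (human_rate) plus a term proportional to current capability
    (feedback * c), capped at some ceiling."""
    c = 1.0
    history = [c]
    for _ in range(steps):
        c = min(ceiling, c + human_rate + feedback * c)
        history.append(c)
    return history

no_feedback = simulate(feedback=0.0)    # human effort only: linear growth
with_feedback = simulate(feedback=0.3)  # self-improvement: compounding growth
```

With zero feedback, capability after 50 steps is just the summed human contributions; with even modest feedback, growth compounds geometrically and reaches the ceiling within the same horizon. The ceiling parameter stands in for the unknown limit discussed above.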

Some reasons progress in AI capabilities might stop before an intelligence explosion include global coordination to halt AI research and a global catastrophe severe enough to stop hardware production and maintenance.