Hi, thank you for your message. I have a couple of questions, hoping you can clarify things a bit better for me:
- The climate change points can be calculated by summing Disaster Intensity and Global Temperature. Do we know how each of these is calculated? From what I read in the thread, Global Temperature rises by 1 point for every 2,000 units of CO2? Right now in my game, 4,000 units of CO2 have been emitted and the temperature has risen by 1.6. I remember one of the patches reduced the speed of global warming, and that discussion was posted in March, so I am not sure the numbers are still up to date. I am also still not sure how Disaster Intensity is calculated.
- On the 1st and 3rd pages of the discussion, two people brought this up:
The CO2 emissions required for the temperature to increase depend on map size, based on these numbers:
Duel map: 500,000 CO2 for a 1-degree temperature increase
Tiny map: 1,000,000 CO2 for a 1-degree temperature increase
Small map: 1,500,000 CO2 for a 1-degree temperature increase
Normal map: 2,000,000 CO2 for a 1-degree temperature increase
Large map: 2,500,000 CO2 for a 1-degree temperature increase
Huge map: 3,000,000 CO2 for a 1-degree temperature increase
What are these numbers? They seem disproportionately large; I am not sure how CO2 units can reach hundreds of thousands or millions. Am I missing something?
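To make the mismatch concrete, here is a minimal sketch of the per-map-size relationship those two posters describe. The thresholds are taken straight from the list above and are unvalidated; the function and map-size names are mine, and the in-game CO2 display may well use a different unit, which would explain the apparent size gap.

```python
# Per-map-size CO2 needed for 1 degree of warming, as reported in the
# thread (unvalidated numbers; in-game CO2 may be shown in other units).
CO2_PER_DEGREE = {
    "Duel": 500_000,
    "Tiny": 1_000_000,
    "Small": 1_500_000,
    "Normal": 2_000_000,
    "Large": 2_500_000,
    "Huge": 3_000_000,
}

def temperature_rise(total_co2: float, map_size: str) -> float:
    """Degrees of global warming implied by total CO2 emitted."""
    return total_co2 / CO2_PER_DEGREE[map_size]
```

Under these numbers, the 4,000 units of CO2 shown in-game on a Normal map would imply only 4,000 / 2,000,000 = 0.002 degrees, nowhere near the observed 1.6, so either the figures are stale or the displayed CO2 counter is in a different unit than the thresholds.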
- I do get the deforestation stats; I will definitely include them.
Yes, we did the analysis before the patch. The analysis was done so we understand the key settings, and from there any changes per patch can just be extrapolated. I'll have another look at the figures, but to answer your question, the key thing in my post I refer you to is:
One more question: so in order to calculate the phases of climate change, we sum the Global Temperature points and the Disaster Intensity points. What do we know about the Disaster Intensity points, and how are they calculated?
NP. Disaster intensity points? Do you mean the disaster intensity setting? The parameters above do discuss how more volcanoes are active with a higher setting and how disasters get more intense with a higher setting (I did not include that XML), but there no longer seems to be a correlation between the disaster intensity setting and the speed of global warming. The settings above show that the realism settings do not change the climate change points. They used to; I have the old settings, but they are old.
With regard to units, there is a global parameter that seems to spell it out, but I have not validated it. So an Ironclad, for example, will emit half a coal per turn in emissions.
In the first thread you sent me, when you hover your mouse over the climate change phase bar, you see the climate change points, which are the sum of Global Temperature and Disaster Intensity. I am not sure how Disaster Intensity is calculated, because in my games I always use the Hyperreal setting, yet the Disaster Intensity points are always 0, even when I am at Phase 3. I also remember there was a patch that decoupled the disaster setting from how fast climate change takes place. My question is whether you know how these Disaster Intensity points are calculated.
In all of my recent games these Disaster Intensity points were always 0; I am not sure whether Firaxis forgot to remove the field after making it irrelevant to the climate change phase, or whether it has some other meaning.
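For reference, the composition described above can be sketched in one line; the names are mine, not taken from the game files, and the observation that the second term reads 0 post-patch is from my own games.

```python
def climate_change_points(global_temp_points: float,
                          disaster_intensity_points: float = 0.0) -> float:
    """Total shown on the phase bar tooltip: Global Temperature points
    plus Disaster Intensity points (the latter appears to always be 0
    in recent patches, hence the default)."""
    return global_temp_points + disaster_intensity_points
```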
Yes. If you pop into Expansion2_Random_Events.xml and change the ClimateChangePoints=0 setting below to, say, 1 or 2, then start up the game and try it, you will find the disaster intensity value is now whatever you set it to. Give it a go to validate; I did test this when GS came out. Firaxis makes mistakes all over the place: for example, the Toa and Digger should have resource requirements, but they failed to change them in the Units.xml file despite having left notes in there to do so. (I hope the Toa do not require iron, because they fought rifles with stone clubs, and that is an awesome thing to keep in the game.)
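The edit would look roughly like this. Only the ClimateChangePoints attribute name and the filename come from this thread; the surrounding element names and the event name are from memory and may differ in your install, so treat this as a shape sketch rather than a copy-paste row.

```xml
<!-- Hedged sketch of a row in Expansion2_Random_Events.xml; the
     enclosing elements and EventType value are placeholders from
     memory, not verified against the current game files. -->
<GameInfo>
  <RandomEvents>
    <Row EventType="RANDOM_EVENT_EXAMPLE" ClimateChangePoints="1"/>
  </RandomEvents>
</GameInfo>
```

Back up the original file first; the game rebuilds its database from these XML files on launch, so a malformed edit can break loading.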