With regard to calibration, sRGB and Rec709 share exactly the same gamut (same primaries and white point); sRGB is calibrated to 2.2 gamma, while Rec709 can be anywhere from 2.2 to 2.4 gamma.
The gamma is often identical though, as Rec709 has historically been calibrated to 2.2, until BT1886 changed things and 2.4 became the assumed default - incorrectly, as a BT1886 calibration with a black level above zero generates an overall gamma closer to 2.2 than 2.4...
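To see why, here is a minimal sketch of the BT1886 EOTF in Python. The example display values (100 nits peak, 0.1 nits black) are my own illustrative assumptions, not part of the standard:

```python
import math

def bt1886_eotf(v, lw=100.0, lb=0.1, gamma=2.4):
    """BT1886 EOTF: luminance in nits for a normalised signal v in [0, 1].

    lw = peak white in nits, lb = black level in nits (assumed example values).
    """
    a = (lw ** (1 / gamma) - lb ** (1 / gamma)) ** gamma
    b = lb ** (1 / gamma) / (lw ** (1 / gamma) - lb ** (1 / gamma))
    return a * max(v + b, 0.0) ** gamma

# Effective gamma at a 50% signal level, relative to peak white:
l_mid = bt1886_eotf(0.5)
effective_gamma = math.log(l_mid / 100.0) / math.log(0.5)
print(round(effective_gamma, 2))  # closer to 2.2 than 2.4
```

With a real-world black of 0.1 nits the measured mid-point gamma works out to roughly 2.2, even though the exponent in the formula is 2.4 - which is the point being made above.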
For image encoding they are different, but that is a very different story, and most sRGB material that is not computer generated is actually encoded to Rec709.
(To see the different encoding standards, see: http://www.lightillusion.com/lut_manual.html#maths
- you may need to pre-load the page, and then use the link.)
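For reference, here is a quick sketch of the two encoding (OETF) curves, using the published piecewise formulas from IEC 61966-2-1 (sRGB) and ITU-R BT.709. Input is scene-linear light in [0, 1]:

```python
def srgb_encode(l):
    # sRGB encoding: linear segment near black, then a 1/2.4 power curve.
    return 12.92 * l if l <= 0.0031308 else 1.055 * l ** (1 / 2.4) - 0.055

def rec709_encode(l):
    # Rec709 encoding: linear segment near black, then a 0.45 power curve.
    return 4.5 * l if l < 0.018 else 1.099 * l ** 0.45 - 0.099

# The curves are genuinely different: 18% grey encodes to about 0.46 in
# sRGB but about 0.41 in Rec709, so decoding with the wrong curve shifts
# the mid-tones.
print(srgb_encode(0.18), rec709_encode(0.18))
```

Which is why "sRGB material encoded to Rec709" is a real distinction, not just a naming issue.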
Also, neither Rec709 nor sRGB is locked to a specific nits level, as they are both 'Relative' standards, unlike ST2084 HDR, which is an 'Absolute' standard - so a Rec709 or sRGB display with a peak luma of 400, 500, or even 1000 nits is perfectly valid.
But, ST2084 does 'suggest/specify' that the nominal diffuse white point (what would be peak white in Rec709/sRGB) is placed at 100 nits, within its absolute standard. That means the average picture brightness/level of ST2084 content can easily be lower than on a Rec709 display when that display is running a peak luma above 100 nits, as is very likely in a normal home lounge situation.
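The absolute nature of ST2084 is easy to see from its inverse EOTF, sketched here with the constants published in SMPTE ST 2084 - it maps absolute luminance in nits straight to a signal value, and 100 nits (the nominal diffuse white) lands at roughly half signal:

```python
# SMPTE ST 2084 (PQ) constants.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits):
    """ST 2084 inverse EOTF: absolute luminance in nits -> signal in [0, 1]."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

print(round(pq_encode(100), 3))    # ~0.51: diffuse white sits near mid-signal
print(round(pq_encode(10000), 3))  # 1.0 at the 10,000 nit absolute peak
```

Everything above that ~50% signal point is reserved for speculars and highlights, which is why the average picture level of PQ material can end up below that of a bright Rec709 display.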
The EBU spec for displays actually defines grade-1 to grade-3 displays as being Rec709/BT1886 with peak luma ranging from 100 nits to 400 nits.
Just for info.