Quoting from Wikipedia,
The bolometric magnitude $M_{bol}$ takes into account electromagnetic radiation at all wavelengths. It includes those unobserved due to instrumental pass-band, the Earth's atmospheric absorption, and extinction by interstellar dust. It is defined based on the luminosity of the star. In the case of stars with few observations, it must be computed assuming an effective temperature. Classically, the difference in bolometric magnitude is related to the luminosity ratio according to:
$ M_{bol,*} - M_{bol,sun} = -2.5\log_{10}\left(\frac{L_*}{L_{sun}}\right) $
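(As a quick worked example: a star 100 times as luminous as the Sun has $M_{bol,*} - M_{bol,sun} = -2.5\log_{10}(100) = -5$, i.e. it is brighter by 5 bolometric magnitudes.)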
In August 2015, the International Astronomical Union passed Resolution B2 defining the zero points of the absolute and apparent bolometric magnitude scales in SI units for power (watts) and irradiance (W/m²), respectively. Although bolometric magnitudes had been used by astronomers for many decades, there had been systematic differences in the absolute magnitude-luminosity scales presented in various astronomical references, and no international standardization. This led to systematic differences in bolometric correction scales, which, when combined with incorrect assumed absolute bolometric magnitudes for the Sun, could lead to systematic errors in estimated stellar luminosities (and in stellar properties that depend on the luminosity, such as radii, ages, and so on).
[leading to the accepted definition]
$ M_{bol} = -2.5\log_{10}\left(\frac{L_*}{L_0}\right) = -2.5\log_{10}(L_*) + 71.1974... $, where $L_*$ is the luminosity in watts and $L_0 = 3.0128 \times 10^{28}\ \mathrm{W}$ is the zero-point luminosity, so the constant term is $2.5\log_{10}(L_0)$.
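If it helps to see the arithmetic, here is a minimal Python sketch of both relations (the function names are my own; the zero-point luminosity and the nominal solar values are the IAU ones):

```python
import math

# IAU 2015 Resolution B2 zero-point luminosity, in watts
L0 = 3.0128e28

def bolometric_magnitude(luminosity_watts):
    """Absolute bolometric magnitude from a luminosity in watts (IAU 2015 definition)."""
    return -2.5 * math.log10(luminosity_watts / L0)

def luminosity_ratio(mbol_star, mbol_sun=4.74):
    """Luminosity ratio L_*/L_sun from the classical magnitude-difference relation."""
    return 10 ** (-0.4 * (mbol_star - mbol_sun))

# Sanity check: the nominal solar luminosity (3.828e26 W) should give Mbol ~ 4.74
print(bolometric_magnitude(3.828e26))  # -> 4.7400...
print(luminosity_ratio(-0.26))         # a star with Mbol = -0.26 is ~100x the Sun
```

The sanity check recovers $M_{bol,sun} \approx 4.74$, which ties the two formulas together: subtracting the IAU definition evaluated for the Sun from the same definition evaluated for the star gives back the classical relation.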
Dunno if this helps, other than that you have to determine the bolometric (total) luminosity of the star in question.