Too many unknowns here. Are 50W and 100W the actual powers the bulbs are dissipating, or is that what they would each dissipate if put across 240V on their own (i.e. normal light bulbs)?
The other thing here is that light bulbs have a cold resistance which is significantly lower than the hot resistance. So if you put two in series like this, the current will be limited by the 50W bulb, and the 100W bulb may never get to its proper operating temperature, staying colder and lower in resistance. So you might find that the 50W bulb has (say) 200V across it (close to normal) while the 100W bulb has only 40V across it and is only just lit.
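To put rough numbers on that guess, here's a minimal Python sketch. It assumes the 50W bulb sits near its normal hot resistance (it has close to normal voltage across it) while the 100W bulb is swept from an assumed fully-cold 1/8 of its hot resistance up to fully hot; the resistance figures come from the R = V^2 / P arithmetic worked through below.

```python
# Rough series-divider sketch; assumes the 50W bulb is at hot resistance
# and tries several assumed resistances for the under-driven 100W bulb.
V_SUPPLY = 240.0

def hot_resistance(power_w, rated_v=240.0):
    """Hot resistance from the rated figures: R = V^2 / P."""
    return rated_v ** 2 / power_w

r50_hot = hot_resistance(50)    # ~1152 ohms
r100_hot = hot_resistance(100)  # ~576 ohms

# Sweep the 100W bulb from an assumed fully-cold 1/8 of hot up to fully hot.
for fraction in (1 / 8, 1 / 4, 1 / 2, 1.0):
    r100 = r100_hot * fraction
    current = V_SUPPLY / (r50_hot + r100)  # same current through both (series)
    print(f"100W bulb at {fraction:.3f} x hot R: "
          f"V(50W) = {current * r50_hot:5.1f} V, "
          f"V(100W) = {current * r100:4.1f} V")
```

Somewhere between 1/4 and 1/2 of hot resistance the split comes out close to the 200V / 40V figure above, and even fully hot the 100W bulb would only see 80V.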
You could do an experiment with a 5W and a 10W 12V bulb in series across 12V, which would be a safe and easy way to verify what happens.
FYI: Power = V x I, so I = P / V, and R = V / I (equivalently R = V^2 / P), so:
a 50W 240V bulb will have 50/240 = 0.208A flowing through it so R = 240 / 0.208 = 1152 ohms
a 100W 240V bulb will have 100/240 = 0.416A flowing through it so R = 240 / 0.416 = 576 ohms
But these are hot resistances, and the cold resistance can be as little as 1/8 of the hot resistance.
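The same arithmetic in code, with the 1/8 cold factor applied as a rough rule of thumb rather than a measured value:

```python
for power in (50, 100):
    current = power / 240  # I = P / V
    r_hot = 240 / current  # R = V / I
    r_cold = r_hot / 8     # assumed worst-case cold resistance
    print(f"{power}W bulb: I = {current:.3f} A, "
          f"R_hot = {r_hot:.0f} ohms, R_cold ~ {r_cold:.0f} ohms")
```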
There is some discussion here:
https://www.quora.com/How-do-you-calculate-the-hot-and-cold-resistance-of-a-light-gulb
(that's right, "gulb" at the end :-)
This property is often used in oscillators: a small light bulb is put in the feedback circuit to stabilise the output voltage. Hewlett-Packard's first product, the legendary 200A oscillator, used this method:
https://en.wikipedia.org/wiki/HP_200A
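As a toy illustration of the mechanism (made-up component values, not the 200A's real circuit): a Wien bridge needs an amplifier gain of exactly 3 to sustain oscillation, and with a lamp in the negative-feedback leg a growing amplitude heats the lamp, raises its resistance, and pulls the gain back down until the loop settles.

```python
# Toy amplitude-settling model of a lamp-stabilised Wien-bridge oscillator.
# All values below are assumptions chosen to make the behaviour visible.
R_FEEDBACK = 2000.0  # fixed feedback resistor (assumed)
R_LAMP_COLD = 400.0  # lamp resistance with no signal (assumed)
HEAT_COEFF = 75.0    # ohms of extra lamp resistance per V^2 of drive (assumed)

def lamp_resistance(amplitude):
    # The lamp heats with signal level, so its resistance rises.
    return R_LAMP_COLD + HEAT_COEFF * amplitude ** 2

amplitude = 0.1  # small starting disturbance, volts
for step in range(100):
    gain = 1 + R_FEEDBACK / lamp_resistance(amplitude)  # non-inverting amp
    # The bridge sustains oscillation at gain 3: the envelope grows above
    # that and shrinks below it; the small exponent stands in for thermal lag.
    amplitude *= (gain / 3) ** 0.1
    if step % 10 == 0:
        print(f"step {step:2d}: gain = {gain:.3f}, amplitude = {amplitude:.3f} V")
```

The loop settles where the lamp resistance reaches R_FEEDBACK / 2 (gain exactly 3), here at an amplitude of about 2.8V, regardless of the starting disturbance.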
Hope this helps. Feel free to come back with more details!