It depends on the resistance of the load you place on the circuit. Assuming the battery, the resistor and the load are in series, the resistor and the load form a voltage divider, and the voltage across the load is determined by the ratio of the load resistance to the total series resistance.
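As a rough sketch of that divider relation (in Python, using a hypothetical 1 kΩ load, since the actual load isn't specified in the question):

```python
def divider_out(v_in, r_series, r_load):
    """Voltage across the load in a simple series voltage divider."""
    return v_in * r_load / (r_series + r_load)

v_batt = 9.0        # battery voltage
r_load = 1_000.0    # hypothetical load resistance (assumption)

for r_series in (120.0, 10_000.0):
    v_load = divider_out(v_batt, r_series, r_load)
    print(f"{r_series:.0f} ohm series resistor -> {v_load:.2f} V across the load")
# Prints roughly 8.04 V for the 120 ohm case and 0.82 V for the 10K case.
```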
If you hook either of these resistors up to a battery and check the voltage with a multimeter, you will see essentially the same voltage as the battery produces. This is because the input resistance of the meter is typically in the tens to hundreds of millions of ohms, so almost all of the battery voltage appears across the meter.
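To see why, plug a typical 10 MΩ meter input resistance into the same divider formula (the exact figure varies by meter, so treat it as an assumption):

```python
r_meter = 10_000_000.0   # assumed 10 Mohm meter input resistance
v_batt = 9.0

for r_series in (120.0, 10_000.0):
    v_meter = v_batt * r_meter / (r_series + r_meter)
    print(f"{r_series:.0f} ohm in series: meter reads {v_meter:.4f} V")
# Both readings come out within about 10 mV of the full 9 V.
```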
http://en.wikipedia.org/wiki/Voltage_divider

If you are trying to reduce and regulate the voltage, a series resistor is not the way to do it. If you are trying to limit the current (for instance to drive an LED), it is. For most LEDs, the 120 Ohm resistor is too small at 9 V and the 10K is too large. The exact value depends on the current you want to drive the LED with and the forward voltage drop across the LED (which varies by color, among other things).
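A back-of-the-envelope sizing, assuming a red LED with roughly a 2 V forward drop and a 20 mA target current (both assumptions; check your LED's datasheet):

```python
def led_resistor(v_supply, v_forward, i_target):
    """Series resistor that drops the excess voltage at the target LED current."""
    return (v_supply - v_forward) / i_target

def led_current(v_supply, v_forward, r_series):
    """Approximate LED current for a given series resistor."""
    return (v_supply - v_forward) / r_series

print(led_resistor(9.0, 2.0, 0.020))         # ~350 ohm is in the right range
print(led_current(9.0, 2.0, 120) * 1000)     # ~58 mA: too much for most LEDs
print(led_current(9.0, 2.0, 10_000) * 1000)  # ~0.7 mA: the LED will be very dim
```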
If you are just putting the resistor across the battery as a load, the current will follow Ohm's law (I = E / R), which works out to 9 / 10000 or 9 / 120 amps in your two cases (0.9 mA and 75 mA). The 10K resistor only dissipates about 8 mW, which is no problem for a 1/4 watt part, but the 120 Ohm resistor dissipates roughly 0.68 W, well above a 1/4 watt rating, so it will run hot and can eventually fail.
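A quick check of those numbers from Ohm's law and P = V²/R (assuming the resistor sits directly across the 9 V battery):

```python
v = 9.0
for r in (120.0, 10_000.0):
    i = v / r        # current through the resistor
    p = v * v / r    # power dissipated in the resistor
    print(f"{r:.0f} ohm: {i * 1000:.1f} mA, {p * 1000:.0f} mW")
# 120 ohm:   75.0 mA, 675 mW -> exceeds a 1/4 W (250 mW) rating
# 10000 ohm:  0.9 mA,   8 mW -> fine for a 1/4 W resistor
```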