Postwar Japan is the period in Japanese history beginning immediately after the end of World War II in 1945 and continuing to the present day. Before and during the war, Japan was known as the Empire of Japan; it is now known simply as Japan (or Nihon-koku, literally the State of Japan).