I read a bunch of articles about what the next greatest benefit to offer employees is. I read one the other day that tried to make it seem like offering food at work is now normal, like everyone is giving away breakfast and lunch the same way you give away health insurance.
That’s the one thing I hate about reading mainstream media HR articles. Apparently, the only employers in America are located within the 50 square miles around Silicon Valley. Do you really expect me to believe that the majority of companies in America are giving away free food to their employees?