Why do these women seem to think they just have to have a boob job? It makes no sense. I always thought Kate was a beautiful woman. She was a real woman, not a plastic image. I understand she may have had a tummy tuck. Maybe some women think they are worthless unless they have bulging boobs screaming "Look at me!"
I am a black man who loves white women. I think she looks nice with or without her new boobs. Don't hate, appreciate.