Cancer incidence is rising in absolute terms, largely because populations are growing and aging, though age-adjusted cancer mortality has declined steadily for decades. Prevention efforts are substantial and measurable: tobacco control, screening programs, and vaccines such as the HPV and hepatitis B vaccines, which prevent cancers rather than cause them. Vaccines are not cancer treatments, and there is no evidence that they worsen cancer; standard treatments such as surgery, chemotherapy, radiation, and immunotherapy carry real side effects but have driven the measured improvement in survival rates.