Access and exposure to a diversity of voices is a fundamental pillar of democracy. In social media markets, exposure diversity is limited by various factors, one being how platforms curate the content each user sees. The algorithmic content-curation systems used by these platforms can artificially reduce exposure diversity in favor of profit-driven objectives. This paper describes the challenge, suggests two possible regulatory solutions to address it, and proposes an analytical framework to assess and compare them. The first is to regulate content curation in a way that guarantees diversity. The second is to unbundle hosting from content-curation activities and to oblige large platforms to allow third parties to offer content curation to users. The two remedies stem from different normative paths and require, among other things, different enforcement mechanisms. The preliminary conclusion is that unbundling might be the better option.