
Debate Preparation Prompt

6 views · GPT-4o · Tested · Tips · 4 examples
This prompt helps users craft compelling arguments and anticipate counterpoints tailored specifically to the side they are defending in a debate. It provides structured reasoning, supporting evidence, and rebuttals to the opposing side's anticipated claims. For more, follow @pubprompt.
$3.99
Get prompt
After purchasing, you will gain access to the prompt file, which you can use with GPT or the app builder. You'll receive 20 free generation credits with this purchase. By purchasing this prompt, you agree to our terms of service.
Over 1 month ago

Prompt Details

Model
Chat - GPT-4o (gpt-4o)
Token size
124 ($0.00620 / call)
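The per-call figure above is simply the prompt's token size multiplied by a per-token rate. A minimal Python sketch of that arithmetic is below; the rate used is an assumption chosen to reproduce the listed estimate, not official model pricing.

```python
# Minimal sketch: estimating the prompt's cost per call from its token size.
# ASSUMED_RATE_PER_TOKEN is an illustrative assumption, not official pricing --
# check the current rate for the model you actually call.
PROMPT_TOKENS = 124
ASSUMED_RATE_PER_TOKEN = 0.00005  # hypothetical $/token

cost_per_call = PROMPT_TOKENS * ASSUMED_RATE_PER_TOKEN
print(f"Estimated prompt cost per call: ${cost_per_call:.5f}")  # ~ $0.00620
```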
Example input
[insert topic here]: Should Social Media Platforms Be Held Liable for the Spread of Misinformation?
[choose side]: Pro (Social Media Platforms Should Be Held Liable)
[list specific points]: Ethical responsibility of platforms; harm caused by unchecked misinformation; existing precedents for corporate accountability
[describe tone]: Logical and persuasive
[insert details]: Ethics, public health, and governance
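A minimal sketch of how these placeholders might be filled and sent to GPT-4o follows. The template string is hypothetical (the real prompt file is only included with purchase); it just reuses the same bracketed fields with the example values above, via the OpenAI Python SDK.

```python
# Minimal sketch: filling the prompt's placeholders and sending it to GPT-4o.
# The template text is hypothetical -- the actual prompt file comes with purchase.
from openai import OpenAI

template = (
    "Help me prepare for a debate on [insert topic here], arguing the [choose side] side. "
    "Focus on [list specific points], keep the tone [describe tone], "
    "and draw on [insert details]."
)

filled = (
    template
    .replace("[insert topic here]",
             "Should Social Media Platforms Be Held Liable for the Spread of Misinformation?")
    .replace("[choose side]", "Pro (Social Media Platforms Should Be Held Liable)")
    .replace("[list specific points]",
             "ethical responsibility of platforms; harm caused by unchecked misinformation; "
             "existing precedents for corporate accountability")
    .replace("[describe tone]", "logical and persuasive")
    .replace("[insert details]", "ethics, public health, and governance")
)

client = OpenAI()  # reads OPENAI_API_KEY from the environment
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": filled}],
)
print(response.choices[0].message.content)
```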
Example output
Your Side's Arguments

Main Argument 1: Ethical Responsibility of Platforms
Reasoning: Social media platforms act as gatekeepers of information and have a moral obligation to prevent harm caused by misinformation, much like traditional media.
Supporting Evidence: Pew Research Center (2021) shows 53% of adults use social media for news, making it a dominant source of information. Cases like COVID-19 vaccine misinformation highlight how platforms directly influence public behavior.
Impact Statement: Failing to act against misinformation undermines public trust and contributes to societal harm.

Main Argument 2: Harm Caused by Unchecked Misinformation
Reasoning: Misinformation spreads rapidly, leading to real-world consequences, including public health crises and election interference.
Supporting Evidence: A study by MIT found that false news spreads six times faster than the truth on Twitter. WHO identified misinformation as a key barrier to COVID-19 vaccination efforts, endangering global health.
Impact Statement: Holding platforms accountable would incentivize better moderation policies and safeguard public welfare.

Main Argument 3: Precedents for Corporate Accountability
Reasoning: Like other industries, tech companies must bear responsibility for the societal impact of their products and services.
Supporting Evidence: Historical examples include tobacco companies held liable for false advertising. Section 230 reform discussions indicate growing bipartisan agreement on corporate responsibility.
Impact Statement: Accountability frameworks ensure that corporations prioritize societal well-being alongside profit.

Rebuttals to Opposing Side

Anticipated Opposing Claim 1: "Holding platforms liable infringes on free speech."
Counter-Rebuttal: Liability focuses on curbing harmful misinformation, not legitimate discourse. Platforms already moderate content (e.g., hate speech bans), proving liability doesn't inherently suppress free speech.

Anticipated Opposing Claim 2: "It's impossible to monitor all content on large platforms."
Counter-Rebuttal: AI moderation tools and human oversight, while not perfect, can be improved with investment. Successful moderation models, such as Facebook's response to certain crisis misinformation, show feasibility.

Anticipated Opposing Claim 3: "Users, not platforms, should be responsible for content."
Counter-Rebuttal: Platforms profit from user content and algorithmically amplify misinformation, making them complicit. Liability encourages platforms to create safer environments and enforce clearer policies.

Delivery Strategy

Preferred Tone: Maintain a logical and persuasive tone, emphasizing accountability and real-world impacts.
Structuring Arguments: Begin with ethical responsibility to set the moral high ground. Follow with the tangible harms of misinformation to establish urgency. Conclude with precedents for accountability to reinforce feasibility and fairness.
Engaging the Audience: Use statistics and expert quotes to establish credibility. Pose rhetorical questions like: "If tobacco companies can be held accountable for public harm, why not social media?"