I've ordered two sets off of AliExpress (the Stargate BC303 and BC304 MOCs) and was quite impressed. No box, digital instructions, and a few minor color-swapped pieces, but both were complete and everything went together very well.
I'll throw a third (or fourth and fifth, since I know a couple of people who'd play this on Mac but have no access to Linux or Windows) request for a Mac version on the pile.
On the contrary, making the tool that makes the tool is what I live for! My personal tech stack has benefited incredibly from this practice and fuels my startup, though it did take me 20 years of slow iteration to get here.
Well I’m not anti ;) … I just mean if your goal is to make the thing and you’re sure you need a tool to do it, watch out for the temptation to make the tool that makes the tool, which is the LONG way around, as OP was saying
that's really dope, but i'm not sure it'll work out the same way nowadays. i think we're in a weird stage where momentum REALLY matters in a way that it didn't 10 years ago and (probably) won't 5 years down the line
It covers more than that, but it's not strictly mandatory.
> Content Filters: Discord users will need to be age-assured as adults in order to unblur sensitive content or turn off the setting.
> Age-gated Spaces: Only users who are age-assured as adults will be able to access age-restricted channels, servers, and app commands.
> Message Request Inbox: Direct messages from people a user may not know are routed to a separate inbox by default, and access to modify this setting is limited to age-assured adult users.
> Friend Request Alerts: People will receive warning prompts for friend requests from users they may not know.
> Stage Restrictions: Only age-assured adults may speak on stage in servers.
> > Stage Restrictions: Only age-assured adults may speak on stage in servers.
Does this mean that in panel-like settings, where hundreds of users are listening to a speaker, you need to be age-verified in order to ask a question or contribute in voice?
What gets deemed “adult” is incredibly random as far as I can tell; some of our servers/messages have triggered it, but no porn or anything like that is shared in them.
Yet at the time X was OPENLY hosting and later selling (a feature hidden behind a paid subscription) CSAM(1) and non-consensual nudity, payment processors were still okay with it.
> "Put another way, Grok generated an estimated 190 sexualized images per minute during that 11-day period. Among those, it made a sexualized image of children once every 41 seconds."