Having your database schema in source control is a great way--maybe the best way--to keep track of DDL changes. But source control isn't always used. And if it isn't, having a periodic backup of a database schema can be helpful. There are a number of ways to do this. Let's review a handful of those options.
Backup & Restore
Native SQL Server backups are a tried & true method to recover the DDL of a lost or altered object. Restore a backup and get whatever you need. The downside is database size: the larger it is, the more resources are needed to perform the restore. Some databases are simply too large to conveniently restore in a timely fashion. If the database is in the FULL recovery model, and log backups are available, you do have the additional benefit of obtaining the DDL from a specific point in time.
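As a sketch, recovering point-in-time DDL this way means restoring a copy of the database alongside the original and rolling the log forward. The database name, file paths, and timestamp below are all assumptions for illustration:

```sql
-- Restore a copy of the database, then roll the log forward to a
-- point in time and script out the object you need.
-- All names and paths here are assumptions.
RESTORE DATABASE SalesDB_Copy
    FROM DISK = N'C:\Backups\SalesDB_Full.bak'
    WITH MOVE N'SalesDB'     TO N'C:\Data\SalesDB_Copy.mdf',
         MOVE N'SalesDB_log' TO N'C:\Data\SalesDB_Copy.ldf',
         NORECOVERY;

RESTORE LOG SalesDB_Copy
    FROM DISK = N'C:\Backups\SalesDB_Log.trn'
    WITH STOPAT = N'2023-06-01T09:00:00',  -- the point in time you want
         RECOVERY;
```

The STOPAT option is what gives you the "DDL as of a specific moment" benefit mentioned above; it's only available when log backups cover that moment.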
SMO

.NET Framework classes in the Microsoft.SqlServer.Management.Smo namespace can be used to generate DDL scripts for database objects. This option takes some development know-how, but it can be done in PowerShell, which might be comfortable for those who aren't avid developers. The SMO classes give you full control over scripting options and the objects generated. In my experience, though, T-SQL script generation is rather slow.
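A minimal PowerShell sketch using the SMO Scripter class might look like the following. The server name, database name, and output folder are assumptions, and it scripts only tables; other object collections (Views, StoredProcedures, etc.) work the same way:

```powershell
# Sketch: script each user table to its own .sql file via SMO.
# Requires the SqlServer module (Install-Module SqlServer).
# Server, database, and folder names are assumptions.
Import-Module SqlServer

$server   = New-Object Microsoft.SqlServer.Management.Smo.Server 'localhost'
$database = $server.Databases['SalesDB']

$scripter = New-Object Microsoft.SqlServer.Management.Smo.Scripter $server
$scripter.Options.IncludeHeaders = $true   # header comment with scripting date
$scripter.Options.DriAll         = $true   # include constraints, keys, etc.
$scripter.Options.ToFileOnly     = $true

foreach ($table in $database.Tables | Where-Object { -not $_.IsSystemObject }) {
    $scripter.Options.FileName = "C:\SchemaBackup\$($table.Schema).$($table.Name).sql"
    $scripter.Script($table)
}
```

This per-object output is what makes the option appealing in theory: each object lands in its own dated file that developers can diff or restore individually.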
DDL Triggers

DDL triggers fire in response to a variety of Data Definition Language (DDL) events. They can be used for recording changes or events in the database schema, among other things, which fits in well with the discussion here. DDL triggers aren't for everyone, though. The data captured via EVENTDATA() is XML, and the trigger may capture a lot of "noise", although you can add trigger logic to omit events you don't care about. On the plus side, you can log *every* change and obtain the DDL statement from any point in time. Logging DDL events might also serve you well for auditing purposes.
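Here's a sketch of what such a trigger might look like: a database-scoped DDL trigger that shreds the EVENTDATA() XML into a log table. The table name, trigger name, and the specific "noise" event filtered out are assumptions:

```sql
-- Sketch: log every database-level DDL event to a table.
-- Table and trigger names are assumptions.
CREATE TABLE dbo.DDL_EventLog (
    EventId     int IDENTITY(1,1) PRIMARY KEY,
    EventTime   datetime2      NOT NULL DEFAULT SYSDATETIME(),
    LoginName   sysname        NOT NULL DEFAULT ORIGINAL_LOGIN(),
    EventType   nvarchar(128)  NULL,
    ObjectName  nvarchar(256)  NULL,
    TSQLCommand nvarchar(max)  NULL,
    EventXml    xml            NOT NULL
);
GO

CREATE TRIGGER trg_LogDDLEvents
ON DATABASE
FOR DDL_DATABASE_LEVEL_EVENTS
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @e xml = EVENTDATA();

    -- Omit "noise" events you don't care about (example filter).
    IF @e.value('(/EVENT_INSTANCE/EventType)[1]', 'nvarchar(128)')
        IN (N'UPDATE_STATISTICS') RETURN;

    INSERT dbo.DDL_EventLog (EventType, ObjectName, TSQLCommand, EventXml)
    VALUES (
        @e.value('(/EVENT_INSTANCE/EventType)[1]', 'nvarchar(128)'),
        @e.value('(/EVENT_INSTANCE/ObjectName)[1]', 'nvarchar(256)'),
        @e.value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]', 'nvarchar(max)'),
        @e
    );
END;
```

The CommandText column is what gives you the exact DDL statement from any point in time.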
SqlPackage.exe

SqlPackage.exe is a command-line utility that automates database development tasks, including creation of dacpac files that contain the schema (or schema and user data) from a connected SQL database. I've found the dacpac file generation to be fast, and the output files small enough that I can keep months' worth of history on disk without ever worrying about disk space usage. Launching SqlPackage.exe from a batch file or PowerShell script is not too difficult.
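A scheduled extract might be sketched like this in PowerShell. The SqlPackage.exe install path, server name, database name, output folder, and six-month retention window are all assumptions:

```powershell
# Sketch: extract a schema-only dacpac with a date-stamped file name.
# Paths, server/database names, and retention window are assumptions.
$stamp  = Get-Date -Format 'yyyyMMdd'
$target = "C:\SchemaBackup\SalesDB_$stamp.dacpac"

& 'C:\Program Files\Microsoft SQL Server\160\DAC\bin\SqlPackage.exe' `
    /Action:Extract `
    /SourceServerName:localhost `
    /SourceDatabaseName:SalesDB `
    /TargetFile:$target

# Prune history older than the retention window.
Get-ChildItem 'C:\SchemaBackup\*.dacpac' |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddMonths(-6) } |
    Remove-Item
```

Run daily from a scheduled task, this yields the rolling on-disk history of dated dacpac files described above.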
DBCC CLONEDATABASE

DBCC CLONEDATABASE generates a schema-only clone of a database. It was designed for investigating performance issues related to the query optimizer, but it can be used to back up schema too. It requires specifying a new name for the clone database each time, which might require dynamic T-SQL. This can be a little tricky to script out on a recurring basis. However, having a query-ready database on hand might make sense, as it's arguably the easiest option for end users.
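The recurring-schedule wrinkle can be sketched with dynamic T-SQL that stamps the clone name with the current date. The database names and naming convention are assumptions:

```sql
-- Sketch: create a date-stamped, schema-only clone on a schedule.
-- Database names and the yyyymmdd naming convention are assumptions.
DECLARE @sql nvarchar(max) =
    N'DBCC CLONEDATABASE (SalesDB, SalesDB_Clone_'
    + CONVERT(nchar(8), GETDATE(), 112)   -- e.g. 20230601
    + N');';
EXEC sys.sp_executesql @sql;
```

The clone comes back read-only, which is usually fine for a query-ready schema reference; old clones need to be dropped periodically since each run creates a new database.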
I've used all of the above options. Backup/Restore is my least favorite, but sometimes it's a necessity. I was a big fan of the DDL trigger option for a long time, and I do still like it for the auditing aspect. I've run into trigger code bugs and other scenarios I wasn't prepared for, though. The SMO PowerShell option is near the bottom of my list as well. When I developed it, I envisioned the developers using the individual object scripts on a regular basis. That simply never happened. DBCC CLONEDATABASE is nice to use when you want to perform a schema compare with Visual Studio. But my favorite is SqlPackage.exe. It has a lot going for it. You can use a dacpac file for a schema compare too--either to a live database or another dacpac file. As mentioned, the dacpac file generation is fast. Whenever a developer has asked for an old DDL definition, it's almost always "I need the script from yesterday". Point-in-time DDL retrieval is nice, but it's overkill.