The concept of variables and constants is common in programming languages: a variable is a named memory location whose value may change over time; a constant is a location whose value will not change. In Swift, we declare variables using the keyword var, and constants using the keyword let.
var daysUntilVacation = 36
let daysInWeek = 7
Are you typing along in a playground? If not, I recommend that you do so: it’ll help you learn!
We’ve declared a variable (daysUntilVacation) and a constant (daysInWeek). We’ve also assigned a value to each: there are 36 days until our vacation, and seven days in each week. Don’t worry about the types of these values for now; they’re just numbers.
By using the keyword var to define daysUntilVacation, we’ve let the compiler know that we intend to change this value at some point (presumably by subtracting 1 from it each day). But daysInWeek is defined as a constant, so if we try to shorten the week by subtracting 1 from it, we’ll get an error:
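The offending line could be as simple as this (the subtraction is my own sketch; any attempt to reassign the constant produces the same error):

```swift
let daysInWeek = 7

// The next line won't compile. Uncomment it to see the error:
// daysInWeek = daysInWeek - 1  // error: cannot assign to value: 'daysInWeek' is a 'let' constant
```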
It’s worth taking time to look at this error (which we can inspect by clicking the red error indicator to the left of the offending line). The text of the error highlighted in red is “Cannot assign to value: ‘daysInWeek’ is a ‘let’ constant.” That makes sense: we shouldn’t be able to change a constant value. But notice also that Swift gives us an option to “fix” the problem by changing the declaration of daysInWeek to a var “to make it mutable.” In this case, that’s not what we want. The number of days in a given week is a constant; the way to fix this error is to delete the line that tries to change daysInWeek entirely.
The Swift compiler is able to optimize immutable (constant) values to a much greater degree than mutable (variable) ones. For that reason, we should use var only for values that actually need to change. For everything else (including object declarations whose property values may change – more on that later), use let.
Primitive Data Types
The type of a variable or constant must be explicitly declared in C or Objective-C. In Swift, values are strongly typed, but types can be inferred from a variable or constant declaration where no explicit type is given. Type inference in Swift follows some pretty strict rules (sometimes surprisingly strict). We’ll start with Swift’s base numeric types, then continue with other types derived from them.
If we declare a var or let with an integer value, as we did in daysUntilVacation and daysInWeek, the type is inferred to be Int. We can also declare a var to be of Int (or any other) type like so:
var frogsOnLilypad : Int
This says “there will be some number of frogs on the lily pad, but I don’t know how many. There will always be a non-fractional number of frogs on the lily pad.” At some point, we’ll have to tell the program just how many frogs there are, but we’re postponing that decision until later.
Surprisingly, it’s also legal to do this:
let numberOfLilypads : Int
This is surprising because the value of a let constant should never change. But if we think about it, it’s not really all that surprising: we haven’t assigned a value to numberOfLilypads yet. When we do, it will be immutable. (Actually, it’s immutable right now. Initially assigning a value to a let constant doesn’t mutate it; it just assigns an initial value.)
We do, however, have an obligation to assign an actual value to numberOfLilypads and frogsOnLilypad at some point before we use them. Try this, for example:
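Something like the following will do it (the print calls are my choice; any attempt to read the values works, and the erroring lines are commented out here so the rest compiles):

```swift
var frogsOnLilypad : Int
let numberOfLilypads : Int

// Uncomment either line to see "used before being initialized":
// print(frogsOnLilypad)
// print(numberOfLilypads)
```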
Two errors in two lines! If you read the error text though, the cause of the errors is pretty obvious: you didn’t assign a value to these two things before you tried to use them. The error actually says the variable or constant (depending on which line you look at) was “used before being initialized.” Initialization is the process of assigning an initial value to a variable or constant.
Let’s take care of the number of lily pads first:
numberOfLilypads = 5
Pretty easy: we now have 5 lily pads, and forever after there will be 5 lily pads in our program, because this is a constant. We should also assign a number of frogs on each lily pad:
frogsOnLilypad = 7
Now we can calculate the total number of frogs on all lily pads:
var totalFrogs = numberOfLilypads * frogsOnLilypad
We declare totalFrogs to be a var because frogsOnLilypad is a var. If the value of frogsOnLilypad ever changes, we want the value of totalFrogs to change as well. Change the line that assigns frogsOnLilypad = 7 to see what I mean; the playground re-evaluates the calculation. Just for fun, let’s change it to an impossible value:
frogsOnLilypad = -3
You should also see the value of totalFrogs change to -15. While it’s impossible for there to be -3 frogs on a lily pad in the real world, that didn’t stop Swift. So if we only allow a whole number of frogs on each lily pad, we need to change the data types of these values.
As I told you before, the base type of a non-fractional declared value is Int. An Int in Swift is a signed integer whose width matches the platform’s native word size. Swift also provides signed and unsigned 8-, 16-, 32-, and 64-bit integer types. These follow a standard naming convention:
Int8: Signed 8-bit integer
UInt8: Unsigned 8-bit integer
Int16: Signed 16-bit integer
UInt16: Unsigned 16-bit integer
Int32: Signed 32-bit integer
UInt32: Unsigned 32-bit integer
Int64: Signed 64-bit integer
UInt64: Unsigned 64-bit integer
You can see the minimum and maximum values of these types by reading the .min and .max properties of each type (try typing the code below in the playground):
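A few representative examples (the exact types you inspect are up to you):

```swift
// Each integer type exposes its representable range as static properties.
print(Int8.min)    // -128
print(Int8.max)    // 127
print(UInt8.min)   // 0
print(UInt8.max)   // 255
print(Int16.max)   // 32767
print(UInt64.max)  // 18446744073709551615
print(Int.min)     // platform-dependent: same as Int64.min on a 64-bit platform
print(Int.max)
```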
You should notice that Int is equivalent to Int32 on a 32-bit platform, and Int64 on a 64-bit platform.
Unless you have a very specific reason not to, always use Int for a general purpose integer in Swift. We do have a specific reason (namely, we don’t want the number of possible frogs on a lily pad to ever be negative), so we’ll change these lines:
var frogsOnLilypad : UInt
let numberOfLilypads : UInt
You’ll also want to change the initialization line for frogsOnLilypad to a value other than -3, or you’ll get an error (because a UInt can’t have a negative value).
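Putting those changes together (keeping 7 as the frog count), the whole example now looks like this:

```swift
var frogsOnLilypad : UInt
let numberOfLilypads : UInt

numberOfLilypads = 5
frogsOnLilypad = 7

var totalFrogs = numberOfLilypads * frogsOnLilypad
print(totalFrogs)  // 35

// The next line won't compile: a UInt can't store a negative value.
// frogsOnLilypad = -3
```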
The reason we typically want to stick with one type of value (Int in this case) is that Swift’s type checking is very strong. You might think we should be able to add two integers of any type, for example an unsigned 8-bit integer 2 and a 32-bit signed integer 4. Go ahead. Try it. I dare you.
var two : UInt8 = 2
var four : Int32 = 4
two + four
Read the error. I mean really read the error. “Binary operator ‘+’ cannot be applied to operands of type ‘UInt8’ and ‘Int32.’” So what’s going on? As I said, type checking is very strong in Swift. Int and all the other integer types are structures (value types) in Swift. The arithmetic operators defined for each of these types deal only with other values of the same type; they aren’t overloaded to mix integer types. Each of these types does have an initializer that accepts other integer types, though, so we can get around this issue with code like:
Int32(two) + four
in other words, by creating a new value that’s a 32-bit signed version of the value of “two.” Sometimes we’ll need to do just these sorts of explicit type conversions, but it’s usually better to avoid them altogether and use Int all the time, unless there’s a really good reason not to.
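The complete working version, with the failing line commented out for reference, looks like this:

```swift
let two : UInt8 = 2
let four : Int32 = 4

// two + four  // error: binary operator '+' cannot be applied to operands of type 'UInt8' and 'Int32'

// Convert two to Int32 first, then add; both operands now have the same type.
let sum = Int32(two) + four
print(sum)  // 6
```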
So why bring up all the signed and unsigned integer types of different bit lengths at this point? Well, the only real opportunity to talk about this kind of stuff is when we’re talking about the fundamental types in Swift. Down the road, you’ll be programming along using a Cocoa library class, and run into an error that says “this can’t be added to that,” or words to that effect. Cocoa (and other iOS frameworks) use values of various bit-widths for all kinds of things. I just don’t want this to come as a surprise to you later. When you run into the problem, you’ll know why it happened and how to fix it.