WEBVTT

00:07.160 --> 00:08.270
In this episode,

00:08.270 --> 00:10.100
we're going to go through social engineering.

00:10.100 --> 00:14.540
Now, you should already have the foundational knowledge of social engineering, whether it's phishing,

00:14.540 --> 00:16.340
spear phishing, vishing, or smishing.

00:16.340 --> 00:20.330
And we're going to go over those at a very high level very quickly throughout this episode.

00:20.330 --> 00:27.290
But CySA+ really wants you to understand the idiosyncrasies associated with each type of social engineering

00:27.290 --> 00:32.480
attack, meaning you should expect to see scenario-based questions and, from those scenarios, be able

00:32.480 --> 00:37.940
to identify the type of social engineering attack: is this a spear phishing or a phishing email?

00:37.970 --> 00:41.390
Maybe it's a vishing call or a smishing text message.

00:41.390 --> 00:44.270
It really depends on how the scenario walks you through it.

00:44.270 --> 00:47.780
Don't expect to see easy questions when it comes to this.

00:47.780 --> 00:50.480
Some of these questions are actually, quite frankly, difficult.

00:50.480 --> 00:55.760
And you need to be able to identify whether an email is legitimate or a phishing email; that really

00:55.760 --> 01:01.430
comes into play on the CySA+ exam, and expect to see examples of those specific emails throughout the exam.

01:01.430 --> 01:06.750
Now, they're not going to pound you over and over with 50 different email types, but I would expect to see

01:06.780 --> 01:11.580
at least one question in your exam with a sample email that you need to identify.

01:11.580 --> 01:13.020
Is it a legitimate email?

01:13.020 --> 01:14.490
Is it a phishing email?

01:14.490 --> 01:15.660
Is it a spear phishing email?

01:15.660 --> 01:18.210
Or is it just a regular social engineering attack?

01:18.300 --> 01:20.460
Be prepared for those types of questions.

01:21.720 --> 01:27.630
Social engineering encompasses, well, phishing, and phishing is just an indirect email attack, meaning

01:27.630 --> 01:29.070
that it's broad in scope.

01:29.070 --> 01:32.910
It's not really directed at an individual or a specific person.

01:32.910 --> 01:35.970
It's aimed more broadly at an entire company.

01:35.970 --> 01:41.730
It's going to use greetings like hey you or hi; it's not going to call you by name or title.

01:41.730 --> 01:45.900
It's not going to have any defining characteristics that pinpoint a person

01:45.900 --> 01:49.830
specifically; these are phishing emails, and they're very broad in scope.

01:49.860 --> 01:53.430
A spear phishing email, in contrast, is very direct.

01:53.430 --> 01:58.680
It's an email attack that's directed at a specific person, and it will include your title, your specific

01:58.680 --> 02:01.710
name, usually first and last name, but not always.

02:01.800 --> 02:03.120
The attacker will know your title.

02:03.150 --> 02:08.860
They may even pull out some specific information about you, like your child's name or your specific

02:08.860 --> 02:16.240
department, and ask for something seemingly normal to start off with, which really doesn't seem that imposing

02:16.240 --> 02:17.920
from the malicious actor's point of view.

02:17.950 --> 02:22.720
However, most spear phishing emails are in it for the long game and not the short attack, meaning

02:22.720 --> 02:27.820
while it starts off soft, it will eventually turn into a hard sell where they're going to try to get

02:22.720 --> 02:27.820
you to click on a link or provide them with credit card information.

02:30.580 --> 02:32.770
Remember, spear phishing emails are very directed.

02:32.800 --> 02:33.730
They're going to have your name.

02:33.730 --> 02:35.680
They're going to have maybe even your phone number.

02:35.680 --> 02:41.710
They're going to have attributes associated with your position and maybe even your personal family members.

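NOTE
The generic-versus-personal contrast above can be sketched in code. This is a toy heuristic of my own, not anything from the episode: broad phishing blasts tend to open with an impersonal greeting, while spear phishing addresses you by name or title. The greeting list and function name are illustrative assumptions.

```python
# Toy indicator: does the email open with a generic, impersonal greeting?
# A generic opener suggests broad phishing; a personal name or title
# suggests a targeted (spear phishing) attempt. Hypothetical greeting list.
GENERIC_GREETINGS = ("dear customer", "dear user", "hey you", "hi,", "hello,")

def greeting_is_generic(email_body: str) -> bool:
    first_line = email_body.strip().splitlines()[0].lower()
    return any(first_line.startswith(g) for g in GENERIC_GREETINGS)

print(greeting_is_generic("Dear customer,\nYour account is locked."))   # True
print(greeting_is_generic("Dear Dr. Jane Smith,\nRe: the Q3 budget."))  # False
```

A real mail filter weighs many more signals; this only illustrates the distinction the episode draws between broad and targeted emails.
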
02:43.750 --> 02:49.840
With vishing, or voice solicitation, these types of attacks come over the voice line for obvious reasons,

02:49.840 --> 02:51.700
but they can include robocalls.

02:51.700 --> 02:56.830
They can also include direct calls where they know not only your name but also your sensitive information,

02:56.830 --> 03:01.420
just like a spear phishing email would. They can also encompass things that are very broad in scope, like

03:01.420 --> 03:03.340
you might see in a phishing email.

03:03.580 --> 03:09.790
To be honest, vishing is just phishing emails and spear phishing emails rolled into a voice

03:09.790 --> 03:13.110
solicitation, where they're trying to get you to actually talk to them.

03:13.110 --> 03:18.210
They usually involve the attacker using authority to get you to provide them information,

03:18.210 --> 03:22.980
or preying on your empathy to get you to provide information because you feel sorry for them,

03:22.980 --> 03:24.330
whichever route they take.

03:24.330 --> 03:28.200
It really is a phishing attack conducted over a voice line.

03:28.230 --> 03:33.060
Now, there's a great example that we provided in the documentation for a YouTube link by Chris Haney,

03:33.510 --> 03:39.600
and he does a great job of showing how vishing could really work in a real-world environment.

03:39.600 --> 03:45.240
Vishing can include impersonation, where they claim a known relationship with you, either as a coworker,

03:45.270 --> 03:47.010
a spouse, or even IT.

03:47.250 --> 03:53.910
It's not uncommon; at a company that I worked for, malicious actors would call somebody

03:53.910 --> 03:56.460
at the company and say, hey, I'm with IT.

03:56.790 --> 04:02.190
I'm trying to get your computer updated to the newest whatever, I need your IP address.

04:02.190 --> 04:05.070
And they'd be like, oh, okay, well, how do I get my IP address?

04:05.070 --> 04:10.290
And the malicious actor would literally walk them through opening cmd and running ipconfig in order

04:10.290 --> 04:15.550
to read off the IP address, which the attacker can then use to establish an RDP connection to

04:15.580 --> 04:15.880
it.

04:15.910 --> 04:21.160
They then just have to get the user to click OK to allow remote control of their computer, and meanwhile

04:21.160 --> 04:22.930
the employee thinks that they're doing good.

04:22.960 --> 04:24.040
They're helping out IT.

04:24.370 --> 04:26.650
The malicious actor is taking over their system.

04:26.650 --> 04:32.440
Social engineering, especially vishing, can be very dangerous, as there is very much a demand for people

04:32.440 --> 04:36.220
who speak English very well to take advantage of employees.

04:36.220 --> 04:41.560
Most people that are expecting a vishing call expect robocalls, or they expect somebody with an accent

04:41.560 --> 04:45.190
that doesn't sound like a native English speaker.

04:45.190 --> 04:50.380
When they're confronted with somebody who speaks English very well and is easy to talk to,

04:50.410 --> 04:53.860
these vishing attacks become very damaging for the organization.

04:54.670 --> 04:59.500
There's something called baiting, and while we haven't really covered baiting in the past, baiting

04:59.500 --> 05:02.320
is an attack where we're providing a free device in public.

05:02.350 --> 05:08.590
Usually this means I go out to a parking lot or an employee break room, and I just leave USB drives on the

05:08.590 --> 05:10.270
ground or around the break room.

05:10.270 --> 05:15.010
We once ran a physical test where we took a box of USB drives.

05:15.010 --> 05:19.880
We left them in the break room, and then we put up a little sign that said free

05:19.910 --> 05:23.000
26-gigabyte USB drives.

05:23.030 --> 05:23.780
Take one.

05:23.810 --> 05:27.950
You'd be surprised how many people picked those up in the break room, because their thinking was:

05:27.980 --> 05:29.330
hey, they're in the break room.

05:29.360 --> 05:31.070
What harm could come into play?

05:31.070 --> 05:33.260
With baiting, it's important that

05:33.260 --> 05:38.990
you as a cybersecurity analyst not only know what it is, but that you train your employees properly on the

05:38.990 --> 05:41.060
fact that we don't take free devices.

05:41.060 --> 05:47.210
This is the reason that we normally shut off USB ports on our computers inside of our offices and organizations.

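NOTE
As a concrete example of shutting off USB storage, here is one common Windows approach: disabling the USB mass storage driver in the registry. This is a standard Windows setting rather than something specified in the episode, and in practice you would push it through Group Policy or endpoint management instead of hand-editing machines.

```reg
Windows Registry Editor Version 5.00

; Setting Start to 4 disables the USB mass storage driver (USBSTOR),
; so plugged-in flash drives are not mounted. Set it back to 3 to re-enable.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\USBSTOR]
"Start"=dword:00000004
```

Note this blocks USB storage devices only; keyboards and mice keep working, which is why defenses against malicious devices that emulate keyboards need additional controls.
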
05:47.210 --> 05:52.730
But baiting is literally giving something away for free, hoping that somebody will pick it up. With a normal

05:52.760 --> 05:58.730
USB drive, there's nothing to be concerned about, but with a malicious USB drive, malicious actors

05:58.730 --> 06:00.290
can put programming code on there.

06:00.290 --> 06:03.440
It runs behind the scenes where people will never see it.

06:03.470 --> 06:07.040
Baiting is a very dangerous attack that we should be fully aware of.

06:07.070 --> 06:08.870
Then there's something called pretexting.

06:08.870 --> 06:10.790
This is a fabricated scenario used

06:10.790 --> 06:15.170
in order to obtain information, and the scenario could play out in numerous different ways.

06:15.170 --> 06:19.880
You can also use pretexting in combination with other forms of social engineering.

06:19.880 --> 06:24.850
For instance, I can pretend to be the IT department, which we just talked about, and give a fabricated

06:24.850 --> 06:30.160
scenario of needing to update your Windows system; that combines vishing with pretexting.

06:30.460 --> 06:36.370
You could also do that with phishing or even spear phishing, where we're providing a fabricated scenario

06:36.370 --> 06:42.430
of, hey, I'm with the HR department and I need to verify that your home information is X, Y, and

06:42.430 --> 06:42.970
Z.

06:43.000 --> 06:44.980
That information is probably made up.

06:44.980 --> 06:50.950
And if we can spoof that HR email address, we can get the potential victim to provide us not only with

06:50.950 --> 06:54.880
their home address and their telephone number, but sometimes even the last four digits of their Social Security

06:54.910 --> 06:55.390
number.

06:55.420 --> 06:58.150
Now, I want you to think about that from a malicious point of view.

06:58.150 --> 07:02.680
If I can get you to give me your home address, your telephone number, your first, last and middle

07:02.680 --> 07:07.780
name, as well as the last four digits of your Social Security number, I can pretty much open a credit

07:07.780 --> 07:12.070
card in your name, or find your banking institution and get them to open up the account.

07:12.100 --> 07:18.370
To me, this presents some serious problems from a personal perspective, but also from an organizational

07:18.370 --> 07:24.460
perspective: how many times have you called IT and they asked for specific information related to

07:24.490 --> 07:30.490
you in order to reset your password? So you can see, from that perspective, pretexting plays a valuable role

07:30.490 --> 07:33.850
inside of cybersecurity from a social engineering standpoint.

07:34.870 --> 07:37.420
Finally, I wanted to talk about obfuscated links.

07:37.420 --> 07:40.600
These are obscured links where we don't provide the full link.

07:40.630 --> 07:42.760
As an attacker, we provide a short link.

07:42.790 --> 07:48.070
You've probably seen these as bit.ly links, or a shortened URL identifier that says something like click

07:48.070 --> 07:48.670
here.

07:48.670 --> 07:54.490
Most people, or at least most casual users, will just glance over it and click the link, having no idea where

07:54.490 --> 07:55.270
it goes to.

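NOTE
Why a shortened link hides the destination can be shown with a minimal in-memory sketch (my own illustration; the shortener domain sho.rt and the function names are hypothetical). Only the shortener's database knows the mapping, so nothing in the short link hints at where it goes.

```python
import hashlib

SHORT_DB = {}  # code -> destination URL, known only to the shortener

def shorten(url: str) -> str:
    """Return an opaque short link for url, remembering the mapping."""
    code = hashlib.sha256(url.encode()).hexdigest()[:7]  # short, opaque code
    SHORT_DB[code] = url
    return "https://sho.rt/" + code  # hypothetical shortener domain

short_link = shorten("https://evil.example/credential-harvest")
print(short_link)                              # no hint of the real target
print(SHORT_DB[short_link.rsplit("/", 1)[1]])  # only the shortener can resolve it
```

This is why security tooling expands shortened URLs before rendering them: the victim cannot judge the destination from the link itself.
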
07:55.300 --> 08:00.730
For instance, I could create an Amazon page that says you've just won a $100 Amazon gift card.

08:00.760 --> 08:03.970
Click here now, and it's just a box that says click here.

08:03.970 --> 08:09.250
We get the user to click on the link, and they go to an off-site website for Amazon that looks perfectly

08:09.250 --> 08:16.480
legitimate, but instead of Amazon.com, it's a lookalike domain, maybe amazon spelled slightly differently or with a zero in place of an o.

08:16.510 --> 08:17.650
You get the point, right?

08:17.650 --> 08:22.750
So those different links provide a segment, or an attack vector, for the attacker

08:22.750 --> 08:31.560
to circumvent the caution of a potentially wary user, meaning I'm going to try

08:31.560 --> 08:34.020
to get that wary user to click on something they shouldn't.

08:34.050 --> 08:36.450
We can use URL shorteners like Bitly.

08:36.480 --> 08:41.490
We can do subdomain spoofing, where I spoof the subdomain so the link doesn't look like what it really is.

08:41.490 --> 08:46.530
I can do a homograph attack, where I swap in lookalike characters so the link appears legitimate while

08:46.530 --> 08:47.520
it goes somewhere else.

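NOTE
A homograph attack relies on lookalike characters, for example a Cyrillic а standing in for a Latin a. As a sketch (my own illustration, not from the episode), a very naive check is to flag any hostname containing non-ASCII characters, and to look at the Punycode form that DNS actually resolves:

```python
def looks_like_homograph(hostname: str) -> bool:
    """Naive check: any non-ASCII character in a hostname is suspicious,
    since lookalike (homograph) domains swap in such characters."""
    return any(ord(ch) > 127 for ch in hostname)

ascii_host = "amazon.com"
spoofed_host = "\u0430mazon.com"  # first letter is Cyrillic 'а', not Latin 'a'

print(looks_like_homograph(ascii_host))    # False
print(looks_like_homograph(spoofed_host))  # True
# The Punycode ("xn--") form is what DNS actually sees for the spoofed name:
print(spoofed_host.encode("idna"))
```

Real browsers apply far more nuanced IDN display policies; this only shows that the two names are different domains despite looking identical on screen.
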
08:47.520 --> 08:53.400
And I could do a script redirect attack, where we're redirecting traffic from one website over to another.

08:53.400 --> 09:00.510
I can use those different attack vectors to hide a link within a box, or even in another piece of text;

09:00.540 --> 09:06.090
just because it says HTTP and displays a link doesn't mean that I can't make those letters or

09:06.090 --> 09:08.250
that link go to a completely different website.

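NOTE
The point above, that a link's visible text can say one URL while the underlying href goes somewhere else, can be checked mechanically. Here is a minimal sketch using only the Python standard library; the class name, helper, and sample HTML are my own illustration:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCollector(HTMLParser):
    """Collect (href, visible text) pairs from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []
        self._current = None  # [href, accumulated text] while inside <a>

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._current = [dict(attrs).get("href", ""), ""]

    def handle_data(self, data):
        if self._current is not None:
            self._current[1] += data

    def handle_endtag(self, tag):
        if tag == "a" and self._current is not None:
            self.links.append(tuple(self._current))
            self._current = None

def deceptive(href: str, text: str) -> bool:
    """True when the visible text looks like a URL on a different host."""
    text_host = urlparse(text.strip()).netloc
    href_host = urlparse(href).netloc
    return bool(text_host) and text_host != href_host

html = '<a href="https://evil.example/login">https://www.amazon.com</a>'
collector = LinkCollector()
collector.feed(html)
href, text = collector.links[0]
print(deceptive(href, text))  # True: text shows amazon.com, href goes elsewhere
```

Mail clients and secure email gateways perform a similar comparison when they warn that link text does not match the destination.
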
09:08.280 --> 09:13.440
These are all part of that attack methodology, or that attack vector, where I'm fooling the user

09:13.440 --> 09:15.810
into doing something they normally wouldn't do.

09:15.810 --> 09:21.420
Offering $100 to a minimum-wage worker isn't out of the question, and I never actually have

09:21.420 --> 09:22.410
to provide them the gift card.

09:22.440 --> 09:24.060
They'll still click on the link.

09:24.720 --> 09:27.210
Now, we talked about social engineering in this episode.

09:27.210 --> 09:28.320
We talked about obscured links.

09:28.320 --> 09:32.560
We talked about the basic foundations of the different social engineering attack vectors that you should

09:32.590 --> 09:35.770
already be aware of, and we refreshed your memory on those different attacks.

09:35.800 --> 09:42.040
Remember, for the CySA+ exam, you should expect scenario-based questions that lead you toward a specific

09:42.040 --> 09:42.970
attack vector.

09:42.970 --> 09:45.310
It's not uncommon to see one of your answer choices

09:45.310 --> 09:50.560
be social engineering, and social engineering could be the proper answer in those exams if the right

09:50.560 --> 09:51.730
example isn't there.

09:51.730 --> 09:55.960
For instance, if you see an attack methodology for spear phishing,

09:55.960 --> 10:01.180
but spear phishing isn't an answer choice, don't be afraid to pick that social engineering answer, because

10:01.210 --> 10:06.640
vishing, phishing, and smishing don't make sense, and it would still fall under that social engineering

10:06.670 --> 10:07.060
attack

10:07.060 --> 10:07.630
answer.

10:07.630 --> 10:11.560
And CompTIA is notorious for pulling those types of stunts within their exams.

10:11.560 --> 10:17.440
Remember, as a cybersecurity analyst, you need to not only know the concepts of social engineering, but

10:17.440 --> 10:23.020
how they're implemented within an attack vector or an attack methodology on a whole other level.

10:23.020 --> 10:27.880
It's not enough just to be able to identify it from the different scenarios or the theoretical concept.

10:27.880 --> 10:33.520
You need to be able to look at the email and identify: is this a spear phishing email attack, and why?
