WEBVTT

00:07.220 --> 00:08.780
Welcome everyone.

00:08.900 --> 00:12.260
I am with Richard Chapman, a SOC manager with Cyber Now Labs.

00:12.290 --> 00:14.420
Richard, what does a Cyber Now Labs

00:14.450 --> 00:15.770
SOC manager do?

00:16.610 --> 00:22.820
My main focus is to work with the analysts to help them understand security events.

00:22.820 --> 00:27.710
Basically investigate those security events and really kind of help keep the network safe and secure,

00:27.710 --> 00:31.730
which is what most security operations center analysts do around the globe.

00:32.420 --> 00:32.750
Great.

00:32.750 --> 00:38.780
So Cyber Now Labs teaches students cybersecurity in a real SOC environment, is that correct?

00:38.780 --> 00:43.370
Can you kind of expand on that and what you teach your students, what tools they would be utilizing?

00:43.460 --> 00:44.540
Yeah, absolutely.

00:44.540 --> 00:49.760
So first and foremost, it's always good to have real, at least as real a situation as you possibly

00:49.760 --> 00:53.060
can when you're learning how to be a security operations center analyst.

00:53.060 --> 00:59.420
And what we've done is we've basically created a real network with real security tools, enterprise-grade

00:59.420 --> 01:01.870
tools that you would see in any SOC around the world.

01:01.870 --> 01:05.800
And our analysts get to investigate real security events.

01:05.800 --> 01:11.140
So they're investigating real attacks on our network, and it gives them the real experience that you

01:11.140 --> 01:16.720
need to be able to say, I know how to work through a security event investigation, do incident response

01:16.720 --> 01:22.180
on that, make good recommendations on what needs to be done to help either get that threat actor out

01:22.180 --> 01:26.890
or block that activity from being successful in the future.

01:27.700 --> 01:33.190
So is it fair to say, then, that within your SOC environment you guys start literally from the

01:33.190 --> 01:39.760
foundation of an event occurring all the way to finishing the event and the paperwork side, can you

01:39.790 --> 01:43.030
kind of walk us through that process?

01:43.060 --> 01:44.170
Yeah, absolutely.

01:44.170 --> 01:49.720
Um, the incident response process starts with preparation, and that means having everything in place

01:49.720 --> 01:52.390
to be able to detect these activities.

01:52.390 --> 01:58.900
So our tools are designed to look at activities that are happening in and around our network.

01:58.900 --> 02:06.060
And they generally have rules that get broken, which create alerts, detections, or notables, depending

02:06.060 --> 02:06.840
on the tool.

02:06.840 --> 02:12.330
And our analysts are investigating those activities, so they get to see them from the start.

02:12.330 --> 02:18.660
Then they understand exactly why it triggered and then they investigate those events to be able to determine

02:18.660 --> 02:24.870
is this actually malicious behavior, or is it a false positive where you've got something triggering

02:24.870 --> 02:27.000
that shouldn't necessarily be triggering?

02:27.000 --> 02:31.050
And if it is, then what can we do to help minimize that in the future as well?

02:31.050 --> 02:37.860
So it is basically from a ground-up kind of perspective, um, here's a SOC environment, here's the

02:37.860 --> 02:38.340
tools.

02:38.340 --> 02:41.550
But let's start with the investigation from the very beginning.

02:41.550 --> 02:46.620
Work all the way through it to even being able to, you know, make recommendations on what needs to

02:46.620 --> 02:48.510
be done to help secure the environment.

02:49.260 --> 02:55.170
Now, I know from my own personal experience that once you get involved into the education portion of

02:55.170 --> 03:00.600
it, a lot of times you kind of fall behind on the trends and what's going on in real life.

03:00.600 --> 03:03.520
How does your SOC differ in that aspect?

03:03.520 --> 03:05.800
Because I know that you've got a live-action SOC.

03:05.800 --> 03:09.520
How do you keep up to date with all the newest security threats that are coming up to bear?

03:10.090 --> 03:14.110
Well, there's a lot to do to actually stay on top of things.

03:14.230 --> 03:19.030
We are a smaller SOC, so we don't get to see the advanced persistent threats.

03:19.180 --> 03:25.870
Um, so we're not getting to see maybe as many of those emerging threats as a larger environment like

03:25.870 --> 03:29.140
a well-known enterprise might see on a daily basis.

03:29.260 --> 03:35.440
So what we generally try to do is we try to make sure that we're recommending to our analysts to read

03:35.440 --> 03:41.260
cybersecurity news, you know, immerse themselves in the threat intelligence that is available out

03:41.260 --> 03:41.980
there in the world.

03:41.980 --> 03:49.240
We also recommend a bunch of different sources for them to read daily so that they see that activity.

03:49.240 --> 03:54.550
And then we also do have some virtual environments where, believe it or not, we actually can create

03:54.550 --> 03:56.770
some of those activities for them.

03:56.800 --> 04:01.330
Now that's a little bit different than a simulated environment that, you know, I guess you

04:01.330 --> 04:06.690
could say other opportunities to learn how to be a cybersecurity analyst might provide because those

04:06.690 --> 04:07.980
generally are static.

04:07.980 --> 04:09.960
It's not new.

04:09.960 --> 04:11.160
It's not recreated.

04:11.160 --> 04:12.330
It's not regenerated.

04:12.330 --> 04:14.940
It's kind of the same thing every single time.

04:14.940 --> 04:21.570
Whereas we can go in and download a brand new piece of malware that just became available, execute it

04:21.570 --> 04:24.960
in an environment, and actually let our analysts see what it does.

04:24.960 --> 04:31.260
So we've got many different angles, I guess that you could say we approach it from so that our analysts

04:31.260 --> 04:33.240
are getting the most up to date knowledge.

04:33.240 --> 04:39.510
They're able to learn as best they possibly can and basically come out of our program being

04:39.510 --> 04:44.910
a solid cybersecurity analyst that's kind of gone through almost like a trade-school-type mentality.

04:45.750 --> 04:50.130
So you're going through and you're teaching your students SOC work, but you're also operating a SOC

04:50.130 --> 04:53.220
in real time with real threats simultaneously going on.

04:53.220 --> 04:56.130
So students get their hands dirty right from the get-go.

04:56.190 --> 05:02.930
Uh, can you kind of explain how prioritization or triage works in your SOC environment?

05:03.500 --> 05:04.190
Absolutely.

05:04.190 --> 05:07.580
So first off, we do teach many different types of tools.

05:07.580 --> 05:10.880
And one of the tools that we utilize is called a SIEM tool.

05:10.880 --> 05:12.470
It's basically a security information and event manager.

05:12.470 --> 05:17.060
And what it's doing is it's looking at the different alerts that are happening around our environment,

05:17.060 --> 05:19.790
and it gives our analysts a single pane of glass.

05:19.790 --> 05:22.610
So that means they can monitor that dashboard.

05:22.610 --> 05:28.820
And when a new offense or a new notable pops up on the dashboard, they can begin investigating that.

05:28.820 --> 05:30.860
Now that could be an email alert.

05:30.860 --> 05:33.020
It could be an endpoint detection alert.

05:33.020 --> 05:38.870
It could be some sort of brute force attack by a, you know, a device somewhere out there in the internet

05:38.900 --> 05:42.860
trying to see if it can brute force its way into one of our systems.

05:43.040 --> 05:47.270
So it gives them one spot that they can focus on and monitor.

05:47.270 --> 05:55.250
And those alerts are ranked, and generally they have some sort of threat scale to them, you know,

05:55.280 --> 05:59.840
a 0 to 10 scale, high severity versus an informational severity.

05:59.840 --> 06:05.950
And those informational alerts might not be the main focus, whereas a critical alert might be one they

06:05.950 --> 06:11.470
really want to focus on because it's a potential critical issue that they need to jump on quickly.

06:11.470 --> 06:14.140
But we also do teach our analysts how to think.

06:14.140 --> 06:17.350
So just because it says critical doesn't mean it necessarily is.

06:17.350 --> 06:21.610
That could be an issue with the way that the rules are tuned.

06:21.760 --> 06:26.410
So we obviously approach that from the realistic perspective as well.

06:26.440 --> 06:30.700
We've had many situations where a rule was created and right off the bat it was wrong.

06:30.700 --> 06:32.830
So it was generating a lot of false positives.

06:32.830 --> 06:37.210
So they have to be able to think through that process just like any analyst would in an environment.

06:37.540 --> 06:39.310
Let me shift gears a little bit here.

06:39.310 --> 06:44.290
So we talked about, uh, triage and a lot of the tools, which we're going to get into in depth.

06:44.290 --> 06:46.810
And I'm going to ask you to show off some of those tools in a few minutes.

06:46.810 --> 06:54.340
But can you discuss how an incident is handled from start to finish?

06:54.370 --> 06:54.610
Right.

06:54.640 --> 06:59.920
We talked about how the data flows, but how would you handle an incident from start to finish

06:59.920 --> 07:04.080
from the ticketing and communication perspective?

07:04.350 --> 07:09.660
I think the first step for an analyst is to be able to understand what they're seeing in the alert.

07:09.660 --> 07:15.180
So if, for instance, they're looking at one of our SIEM tools and they're looking at an offense,

07:15.180 --> 07:18.990
it's going to give them artifacts that are associated with that offense.

07:18.990 --> 07:22.200
It's going to give them the date, the time, how many events

07:22.200 --> 07:28.980
there were, different pieces of information that allow the analyst to start to put together the picture

07:28.980 --> 07:30.030
of what happened.

07:30.030 --> 07:35.310
So that's kind of the first part of it, basically identifying that activity and really understanding

07:35.310 --> 07:37.350
it, detecting it and then identifying it.

07:37.350 --> 07:43.470
Then that analyst is going to start to really investigate and analyze what they're seeing, and they're

07:43.470 --> 07:50.190
going to start to utilize open source intelligence, their understanding of network protocols, their

07:50.190 --> 07:55.290
understanding of the systems that we have in our environment, their understanding of the device that's

07:55.320 --> 07:57.270
actually carrying out the activity.

07:57.270 --> 08:02.910
If it's an external device, they're going to start to paint the picture as to what's going on at that

08:02.910 --> 08:03.480
point.

08:03.510 --> 08:04.880
Then they're going to look and see:

08:04.880 --> 08:09.950
Was any of that action allowed? Was it successful, or was it blocked?

08:09.950 --> 08:13.820
And at that point it gives them the opportunity to then make a declaration.

08:13.820 --> 08:17.660
Is this a true positive attack that we need to take action on now?

08:17.660 --> 08:23.750
Was it a true positive attack from a malicious actor that was blocked effectively by the security tools?

08:23.750 --> 08:25.040
That's a good thing.

08:25.250 --> 08:31.250
Or is it a false positive like I mentioned earlier where the rule triggered, but it might not necessarily

08:31.250 --> 08:37.460
should have triggered? Once they make a clear declaration, a lot of times it's going to be ticketing

08:37.460 --> 08:37.760
next.

08:37.760 --> 08:42.020
So that means taking the steps to grab all of those artifacts right there.

08:42.020 --> 08:47.240
Ticketing: painting the picture of what happened, really kind of telling the story. On this day and this

08:47.240 --> 08:55.310
time, this IP, uh, triggered a detection in our environment by carrying out this activity on this

08:55.340 --> 08:56.240
internal device.

08:56.240 --> 08:56.870
Right.

08:56.930 --> 09:00.410
Um, it's kind of telling the story, so to speak.

09:00.410 --> 09:05.220
And depending on what needs to be done, the analyst is going to be able to make recommendations.

09:05.220 --> 09:07.320
They're going to say, this is what needs to be done.

09:07.320 --> 09:08.610
We need to block this URL.

09:08.610 --> 09:11.670
We need to block this IP or add this IP to a watch list.

09:11.670 --> 09:15.090
Or maybe this user needs to have their password reset.

09:15.120 --> 09:18.690
So maybe they're going to send a ticket over to the help desk to accomplish that.

09:18.690 --> 09:24.210
So it's really kind of stepping through from again the very beginning when you're just getting details

09:24.210 --> 09:28.800
to understanding it, investigating it, making the declaration and then being able to write an effective

09:28.830 --> 09:34.200
ticket for whoever needs to see it, whether it's somebody in the business arm of the of the company,

09:34.200 --> 09:41.220
or whether it's the next-tier analyst who's going to dig in deeper and really get to the fine

09:41.220 --> 09:47.910
granular details that maybe a lower level analyst might not be able to understand in their early stages

09:47.910 --> 09:49.350
of being a SOC analyst.

09:50.010 --> 09:55.620
So for me personally, I often get asked by brand new people just trying to get into cybersecurity.

09:55.620 --> 09:59.490
Maybe they already have Security+, but they're trying to take their first step into cybersecurity.

09:59.490 --> 10:05.150
And I often get asked, oh, I just want to work in the basement with the hoodie on and the sunglasses.

10:05.180 --> 10:06.650
I mean, obviously it's not that bad, right?

10:06.680 --> 10:12.470
But I hear this mentioned a lot; people have this misconception, at least in my eyes, of thinking

10:12.470 --> 10:16.880
that I'm going to get into cybersecurity and I get to work alone in the basement, and I don't have

10:16.880 --> 10:17.660
to deal with anybody.

10:17.690 --> 10:20.780
How true or false is that narrative?

10:21.080 --> 10:28.430
Well, I will say, in today's post-Covid world, working alone at home in a dark office can

10:28.430 --> 10:30.050
be a real situation.

10:30.200 --> 10:34.880
Um, even with that being said, there's a lot of collaboration.

10:34.910 --> 10:35.600
Uh, yes.

10:35.600 --> 10:39.590
There are times where you're going to do independent investigations, independent work.

10:39.710 --> 10:42.650
But, you know, collaboration is key in a SOC.

10:42.770 --> 10:45.320
No analyst knows everything.

10:45.320 --> 10:52.610
And being able to pull in an extra set of eyes, or maybe even a team of extra analysts who have got

10:52.610 --> 10:56.240
a lot more years of experience put together than you do by yourself

10:56.270 --> 11:02.390
can be the difference between identifying a real threat and maybe missing that threat.

11:02.420 --> 11:07.150
So individualized work does happen, but collaboration is absolutely key.

11:07.180 --> 11:11.920
You have to be able to effectively communicate with your team to be able to even, like I mentioned

11:11.920 --> 11:13.360
earlier, paint that picture.

11:13.360 --> 11:17.890
Tell your team what you're seeing, what you've been able to identify.

11:17.980 --> 11:25.000
Um, maybe where some of your hang-ups are, what your thoughts are, and allow them to learn

11:25.000 --> 11:28.900
from what you've already done so they don't have to redo the investigation.

11:28.900 --> 11:36.490
They can pick up right where you left off and provide additional guidance and or knowledge and direction.

11:37.210 --> 11:41.770
So it sounds like teamwork is really one of the core fundamentals of being in a SOC.

11:41.800 --> 11:47.200
I know for me personally, especially very early on when I was getting first into it and first into

11:47.200 --> 11:53.710
telecom, I kept on being told by my own superiors that it is not a solo sport, it is

11:53.710 --> 11:57.010
very much a team sport and you can't know everything all the time.

11:57.010 --> 12:02.140
Would you agree with that sentiment that not only is collaboration and communication important, but

12:02.140 --> 12:07.710
being able to stand up and say I don't know, let me ask somebody and let me get the right answer.

12:07.740 --> 12:12.090
Can you kind of go into depth, maybe with a scenario or something, what your own experience tells

12:12.090 --> 12:12.870
you about that?

12:12.900 --> 12:15.150
Yeah, I will tell you 100%.

12:15.150 --> 12:16.650
I agree with that statement.

12:16.830 --> 12:23.340
Um, I've had situations like that myself where I'm working with, uh, analysts on a SOC shift, and

12:23.340 --> 12:27.900
one of them will escalate something to me, and I'm like, I've never seen this before, so you have

12:27.900 --> 12:32.010
to start digging in and really trying to understand exactly what's going on.

12:32.010 --> 12:37.740
And I always start off with telling my analysts, it's very important to understand who's involved,

12:37.740 --> 12:40.260
the who, the what, the when, the where, the why.

12:40.290 --> 12:40.860
Right.

12:41.070 --> 12:45.720
The more of that that you can put together, the better you're going to understand the possibilities.

12:45.900 --> 12:50.700
Um, but yeah, there are absolutely situations where you're like, I don't understand why this is showing

12:50.700 --> 12:55.380
up this way or why this is what it is, or I've never seen this process before.

12:55.410 --> 12:57.780
Let's go figure out what this process is.

12:57.810 --> 13:01.350
And, you know, we try to make our analysts self-sufficient.

13:01.350 --> 13:08.960
So we absolutely try to help them be resourceful so that they will take as many steps as possible individually,

13:08.990 --> 13:10.940
on their own, as an analyst first.

13:10.970 --> 13:13.640
I mean, you have to be able to do that to a certain degree, right?

13:13.640 --> 13:17.990
But you do reach a point where you go, I need help.

13:17.990 --> 13:19.400
I don't know what this is.

13:19.400 --> 13:22.310
I cannot make heads or tails of this.

13:22.310 --> 13:24.590
I think it's this, but it looks like this.

13:24.590 --> 13:26.990
I can't find any information on that.

13:27.020 --> 13:33.410
Whatever the case may be, there are times where you hit a point where your knowledge is exhausted and

13:33.410 --> 13:39.560
your research is even not presenting additional opportunities to learn what could be going on.

13:39.560 --> 13:42.020
So you have to be able to ask for help at that point.

13:42.020 --> 13:46.760
It's a valuable tool that I think analysts have to have.

13:47.630 --> 13:48.080
Awesome.

13:48.080 --> 13:50.480
So we talked a little bit about incident response.

13:50.480 --> 13:52.790
We talked a little bit about teamwork and collaboration.

13:52.790 --> 13:58.970
I'm going to totally shift gears on you again and ask about compliance, because I know from my

13:59.000 --> 14:04.880
own personal perspective and the many, many years that I worked in IT, compliance is a big issue.

14:04.910 --> 14:09.960
How does a SOC help meet those compliance requirements that we see?

14:10.230 --> 14:15.360
First off, I would agree that compliance is not only important, it's becoming more important on a

14:15.360 --> 14:16.410
daily basis.

14:16.710 --> 14:21.930
I actually was reading an article this morning about a very well-known tech company who is actually

14:21.930 --> 14:29.670
going to start holding their executives responsible for major incidents that happen.

14:29.670 --> 14:35.460
In other words, it's going to put pressure on executives to ensure that their security teams are following

14:35.460 --> 14:36.840
best practices.

14:36.840 --> 14:43.020
They're utilizing frameworks like the NIST framework to be able to ensure that they have the best possible

14:43.020 --> 14:50.610
security in place to do everything possible to keep breaches from happening.

14:50.610 --> 14:55.320
So I think there's going to be a much larger level of accountability as we move forward.

14:55.320 --> 15:02.550
As a SOC analyst, it's obviously our job to do the investigations and to report on our findings.

15:02.550 --> 15:04.230
So the better we can do at that,

15:04.230 --> 15:10.610
and the more that we can actually provide those details to the tier twos, to the SOC managers, to

15:10.640 --> 15:11.630
the CSOs,

15:11.630 --> 15:18.080
that's going to allow them to make better decisions and implement better security measures

15:18.080 --> 15:21.080
across the, you know, across the network.

15:21.110 --> 15:21.500
Right.

15:21.500 --> 15:24.920
So I think we play a key role in that.

15:24.920 --> 15:26.930
We are that initial line of defense.

15:26.930 --> 15:34.340
We're looking at things from the beginning and everything kind of gets passed up to the next level,

15:34.340 --> 15:38.030
and it gets processed all the way up to the highest levels,

15:38.030 --> 15:43.640
who are running the whole operation, and they need the best information possible, and they need to

15:43.640 --> 15:48.800
be able to compare what they're doing and what they're seeing to those frameworks, to be able to keep

15:48.800 --> 15:53.060
the environment as safe and the data as safe as possible.

15:53.930 --> 15:56.900
Richard, thank you so much for your time today. In the next episode,

15:56.900 --> 16:01.910
you're actually going to show us some tools related specifically to a SOC environment and dive into

16:01.910 --> 16:02.840
those complexities.

16:02.840 --> 16:06.020
So thank you so much for your time. Thank you for having me.

16:06.020 --> 16:06.650
Pleasure.
