Hello, my name is Disaiah Bennett. I am a senior undergraduate at Elizabeth City State University (ECSU) in Elizabeth City, North Carolina, majoring in computer science with a scientific concentration and a minor in mathematics. My passion for computer science led to my interest in data analysis, networking, programming, and software engineering. I plan to complete my bachelor's degree at ECSU and then continue my education at the graduate level to obtain a Master's degree in Data Analytics.
To gain and develop a strong understanding of the field of computer science, I have challenged myself by attending internships, conferences, and a hackathon.
Computer Scientist
(252)267-3254
lavontae.bennett@gmail.com
officialdisaiahbennett@gmail.com
Expected Graduation Date: May 2019
Session: Fall 2017
Location: Elizabeth City State University
Session: 2016 - 2017
Location: Elizabeth City State University
Session: Spring 2018
Location: Elizabeth City State University
Session: Fall 2017
Location: Elizabeth City State University
Session: Spring 2018
Location: Elizabeth City State University
Session: Fall 2018
Location: Elizabeth City State University
Session: Fall 2017
Location: Morgan State University - Baltimore, Maryland
Session: Fall 2017
Location: Atlanta, Georgia
Session: Fall 2017
Location: University of Michigan - Ann Arbor, Michigan
Session: 2016 - Present
Session: 2016 - Present
Session: 2016 - Present
Session: 2018 - Present
The Center of Excellence in Remote Sensing Education and Research (CERSER)
on the campus of Elizabeth City State University is a founding member of the
Science Gateways Community Institute (SGCI), led by the San Diego Supercomputer Center.
One of the five areas of SGCI is Workforce Development, led by Dr. Linda Hayden.
Workforce Development aims to nurture the next generation of gateway users
and developers and to engage the potential of students from underrepresented groups.
As science today grows increasingly computer based, it poses challenges and opportunities
for researchers. Scientists and engineers are turning to gateways to allow them to analyze,
share, and understand large volumes of data more effectively. The existence of science
and engineering gateways and the sophisticated cyberinfrastructure (CI) tools together
can significantly improve the productivity of researchers. Most importantly,
science gateways can give uniform access to the cyberinfrastructure that enables
cutting-edge science. The goal of the web development team was to increase the interactivity
of the SGCI Young Professionals site to attract potential members and disseminate information.
This was completed utilizing WordPress Widgets to provide graphical and interactive components.
Bootstrap components (HTML, CSS, and JavaScript) were also researched for their inclusion into
the current WordPress Content Management System (CMS) and the future Liferay CMS.
Keywords: SGCI, Science Gateways Community Institute, Bootstrap,
HTML, CSS, JavaScript, WordPress, Liferay, widgets, components
Given the potential for leakage inside SGX enclaves caused by software
vulnerabilities, new understanding is needed to mitigate the threat. Two recent
works have been published on this topic: SGX-Shield, which proposed load-time
randomization at NDSS'17, and "Hacking in Darkness: Return-oriented Programming
against Secure Enclaves," which appeared at USENIX Security'17. The research
follows a routine: a basic attack outside SGX to understand ROP, a ROP attack
inside an SGX enclave, and JIT-ROP to bypass SGX-Shield. From there, we can
propose mitigation strategies, such as run-time live re-randomization. Both
JIT-ROP inside SGX and run-time live re-randomization are research problems
we can try to solve.
Keywords: Intel SGX, ROP, Enclaves, Re-Randomization
Building a self-learning artificial intelligence that aids individuals with therapeutic
conversation at a time of their choosing and in their own personalized space, using Watson
APIs connected with IBM's Node-RED.
Keywords: Artificial Intelligence, Watson, APIs, Self-Learning
Python scripts can be written for a wide variety of projects. For this project,
a Python script was written to perform image comparison using libraries available
to the Python language, including PIL (Image) and OpenCV. The program asks the
user for two images and a target size, displays the pixel values of both images,
and then displays an output image in which every section where the pixels of the
two images match is "blacked out." The team later found that the two images had
to be resized to the same dimensions before they could be compared, and that the
required libraries had to be installed on any other computer running the program.
In the future, the team plans to make the program more robust and efficient.
Keywords: Python, Image Comparison
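The pixel-matching step described above can be sketched in plain Python. The original script relied on PIL/OpenCV, so the helper names and the nearest-neighbour resize below are illustrative assumptions, not the project's actual code; plain lists stand in for image pixel arrays.

```python
# Minimal sketch: resize two grayscale "images" to a common size, then
# black out (zero) every pixel where the two images match.

def resize_nearest(pixels, new_w, new_h):
    """Nearest-neighbour resize of a 2-D grid of pixel values."""
    old_h, old_w = len(pixels), len(pixels[0])
    return [
        [pixels[y * old_h // new_h][x * old_w // new_w] for x in range(new_w)]
        for y in range(new_h)
    ]

def black_out_matches(img_a, img_b):
    """Return an image where matching pixels become 0 and differing
    pixels keep the value from img_a."""
    return [
        [0 if pa == pb else pa for pa, pb in zip(row_a, row_b)]
        for row_a, row_b in zip(img_a, img_b)
    ]

# Usage: two 2x2 grayscale images; only one pixel differs.
a = [[10, 20], [30, 40]]
b = [[10, 99], [30, 40]]
result = black_out_matches(a, b)  # -> [[0, 20], [0, 0]]
```

Resizing both inputs to the same dimensions first (as the team discovered) is what makes the element-wise comparison valid.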
The IBM Cloud Internet Services (CIS) API was developed to process HTTP requests with the
assistance of Cloudflare. As the API moved into production, IBM needed it to perform
at a higher level and handle more than 100 requests per second. To address the problem,
a new Python module was developed based on the pycurl module. With this module in place,
the system was able to process more requests with improved overall performance.
Keywords: Pycurl, Python, HTTP, TLS, IP, Requests, Web Application Firewall
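The module itself is IBM-internal, but the core idea behind handling more requests per second, reusing one persistent connection instead of opening a new one per request, can be sketched with the standard library alone (pycurl achieves the same reuse through a persistent `pycurl.Curl` handle). The throwaway local server below is an illustrative stand-in, not the CIS API.

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class _Handler(BaseHTTPRequestHandler):
    protocol_version = "HTTP/1.1"  # enable keep-alive so the connection is reusable

    def do_GET(self):
        body = b"ok"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo output quiet

def run_requests(host, port, n):
    """Send n GET requests over a single persistent connection."""
    conn = http.client.HTTPConnection(host, port)
    statuses = []
    for _ in range(n):
        conn.request("GET", "/")
        resp = conn.getresponse()
        resp.read()  # drain the body so the connection can be reused
        statuses.append(resp.status)
    conn.close()
    return statuses

# Demo against a local server on an ephemeral port.
server = HTTPServer(("127.0.0.1", 0), _Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
statuses = run_requests("127.0.0.1", server.server_address[1], 10)
server.shutdown()
```

Skipping the per-request TCP (and TLS) handshake is where most of the throughput gain comes from at high request rates.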
The web crawler is an open-source system that allows the intended user to automatically
scrape product details and reviews from a targeted company's site and store the information.
Keywords: Web Crawler, Open Source
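The parsing step of such a crawler, pulling a product name and review text out of fetched HTML, can be sketched with the standard library's `html.parser`. The tag names and CSS classes ("product-name", "review") and the sample page are illustrative assumptions about the target site's markup, not its actual structure.

```python
from html.parser import HTMLParser

class ProductReviewParser(HTMLParser):
    """Collects the product name and review texts from a page."""

    def __init__(self):
        super().__init__()
        self._capture = None   # what the next text node belongs to
        self.product = None
        self.reviews = []

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class", "")
        if "product-name" in cls:
            self._capture = "product"
        elif "review" in cls:
            self._capture = "review"

    def handle_data(self, data):
        text = data.strip()
        if not text or self._capture is None:
            return
        if self._capture == "product":
            self.product = text
        else:
            self.reviews.append(text)
        self._capture = None

# Usage: parse a small sample page (a real crawler would fetch this HTML).
page = """
<div class="product-name">Walking Cane</div>
<p class="review">Sturdy and light.</p>
<p class="review">Great grip.</p>
"""
parser = ProductReviewParser()
parser.feed(page)
```

A full crawler would wrap this in a fetch loop over discovered URLs and persist `parser.product` and `parser.reviews` to storage.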