Seminars

The seminars will generally be held every Friday at noon in Room 105, College of Engineering West Hall, on the Monroe Park Campus. For more information, please contact our seminar coordinator, Carol Fung, Ph.D.

Use Adobe Acrobat Reader to view the downloadable seminar fliers and related materials.

If you need assistance accessing any of the information on this page, please contact the Marketing & Communications department.

Spring 2018

Title: Monte Carlo Algorithms for Numerical Linear Algebra and Their Applications in Bioinformatics

Friday, 5/11/18 | Noon-1 pm | West Hall, W105

Speaker: Yaohang Li, Ph.D.

Abstract: In the era of “big data,” numerical linear algebra operations on large matrices underlie many applications, ranging from data mining to large-scale simulations and machine learning. Monte Carlo methods for numerical linear algebra, which are based on statistical sampling, exhibit many attractive properties, including fast approximate results, memory efficiency, reduced matrix element accesses, and intrinsic parallelism. As a result, there has been a recent increase of interest in using Monte Carlo methods to carry out computations on large matrices. In this talk, we will present Monte Carlo methods for approximating solutions to problems associated with large matrices, including singular value decompositions, extreme eigenvalues/eigenvectors, matrix-matrix multiplications, and the solution of linear systems. Bioinformatics applications, including drug repositioning, lncRNA-disease associations, and protein structure modeling, will also be discussed.
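
For readers who want to see the sampling idea in action, here is a minimal Python/NumPy sketch (the function name and sampling scheme are illustrative assumptions, not code from the talk) of a textbook Monte Carlo approximation of a matrix-matrix product: sample outer products of columns of A with rows of B in proportion to their norms, and rescale so the estimate is unbiased.

```python
import numpy as np

def mc_matmul(A, B, num_samples, rng=None):
    """Approximate A @ B by sampling outer products of columns of A with rows of B.

    Indices are drawn with probability proportional to the product of the column
    and row norms, and each sampled outer product is rescaled so that the
    estimator is unbiased.
    """
    rng = np.random.default_rng() if rng is None else rng
    probs = np.linalg.norm(A, axis=0) * np.linalg.norm(B, axis=1)
    probs /= probs.sum()
    estimate = np.zeros((A.shape[0], B.shape[1]))
    for _ in range(num_samples):
        k = rng.choice(A.shape[1], p=probs)
        estimate += np.outer(A[:, k], B[k, :]) / (num_samples * probs[k])
    return estimate

# Toy usage: the relative error shrinks as num_samples grows.
A, B = np.random.rand(200, 300), np.random.rand(300, 150)
error = np.linalg.norm(mc_matmul(A, B, 100) - A @ B) / np.linalg.norm(A @ B)
print(f"relative error with 100 samples: {error:.3f}")
```

Only a small subset of columns and rows is ever touched, which is what gives sampling-based methods their memory and data-access advantages on very large matrices.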

Bio: Yaohang Li is an Associate Professor in the Department of Computer Science at Old Dominion University. He received his Ph.D. in Computer Science from Florida State University in 2003. He is an NSF CAREER Award recipient. His research interests are in Computational Biology, Monte Carlo methods, Big Data Analysis, and Parallel/Distributed Computing.

Title: Towards Secure Big Data Computing

Friday, 5/04/18 | 11 am-Noon | West Hall, W105

Speaker: Changqing Luo, Ph.D.

Abstract: We are now in the big data era. We are able to collect more data than ever before from various systems and applications, such as the Internet of Things, cyber-physical systems, smart cities, and smart healthcare. Analyzing such data usually requires intensive computation, which poses a great challenge. Cloud computing is an efficient and economical way to overcome this limitation. However, it raises security and privacy concerns because users’ data may contain information that is sensitive for ethical, legal, or security reasons. Previous works have attempted to address these concerns with algorithms that fall into two categories: traditional cryptography-based methods and linear-transformation-based methods. Unfortunately, these algorithms are not efficient for practical big data applications: the former still bear high computational complexity, while the latter incur high communication cost and provide no security guarantee. To this end, I investigate how to efficiently solve big data computing problems with low computation and communication cost while protecting data security. Solving these problems lays the foundation for secure applications in many areas, including medicine, power systems, simulation, and engineering. In this talk, I focus on how to design a secure outsourcing algorithm for matrix factorization, one of the most fundamental and important big data computing problems.
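
As a toy illustration of the "linear transformation based" category mentioned above (this is not Dr. Luo's algorithm, which targets matrix factorization; the masking scheme and variable names are illustrative assumptions), the sketch below hides a linear system behind random invertible masks before handing it to an untrusted solver.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Client-side problem: solve A x = b without revealing A or b.
A = rng.random((n, n)) + n * np.eye(n)   # well-conditioned toy matrix
b = rng.random(n)

# Client masks the problem with random invertible matrices P and Q.
P = rng.random((n, n)) + n * np.eye(n)
Q = rng.random((n, n)) + n * np.eye(n)
A_masked, b_masked = P @ A @ Q, P @ b

# Untrusted "cloud" solves the masked system without ever seeing A or b.
y = np.linalg.solve(A_masked, b_masked)

# Client unmasks locally: P A Q y = P b  implies  A (Q y) = b, so x = Q y.
x = Q @ y
print(np.allclose(A @ x, b))  # True
```

Practical schemes use cheap, structured masks (for example sparse or permutation-based ones) so the client's masking work stays far below the cost of solving the problem itself; dense masks are used here only for clarity.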

Bio: Changqing Luo is currently a Ph.D. student in the Department of Electrical Engineering and Computer Science at Case Western Reserve University. He received his B.E. and M.E. degrees in Telecommunication Engineering and in Communication and Information Systems from Chongqing University of Posts and Telecommunications, Chongqing, China, in 2004 and 2007, respectively, and an earlier Ph.D. in Communication and Information Systems from Beijing University of Posts and Telecommunications, Beijing, China, in 2011. He was a Lecturer at the School of Computer Science and Technology, Huazhong University of Science and Technology, China, from 2011 to 2013.

Title: Big Data Bridge

Monday, 4/30/18 | 11 am - 12 pm | West Hall, W106

Speaker: Dr. Justin Zhan

Abstract: Data has become the central driving force behind new discoveries in science, informed governance, insight into society, and economic growth in the 21st century. Abundant data is a direct result of innovations including the Internet, faster computer processors, cheap storage, and the proliferation of sensors, and it has the potential to increase business productivity and enable scientific discovery. However, while data is abundant and everywhere, people do not have a fundamental understanding of data. Traditional approaches to decision making under uncertainty are not adequate for massive amounts of data, especially when such data is dynamically changing or becomes available over time. These challenges require novel techniques in data analytics, data-driven optimization, systems modeling, and data mining. In this seminar, a number of recently funded data analytics projects will be presented to address various data analytics, mining, modeling, and optimization challenges. In particular, DataBridge, a novel data analytics system, will be illustrated.

Bio: Dr. Justin Zhan is a professor in the Department of Computer Science, College of Engineering, and the Department of Radiology, School of Medicine, as well as the Nevada Institute of Personalized Medicine, University of Nevada, Las Vegas. His research interests include Big Data, Information Assurance, Social Computing, Biomedical Computing, and Health Informatics. He has been a steering chair of the International Conference on Social Computing (SocialCom) and the International Conference on Privacy, Security, Risk and Trust (PASSAT), and editor-in-chief of the International Journal of Privacy, Security and Integrity and the International Journal of Social Computing and Cyber-Physical Systems. He has served as a conference general chair, program chair, publicity chair, workshop chair, or program committee member for over one hundred and fifty international conferences, and as an editor-in-chief, editor, associate editor, guest editor, editorial advisory board member, or editorial board member for about thirty journals. He has published more than two hundred articles in peer-reviewed journals and conferences and delivered thirty keynote speeches and invited talks. His research has been extensively funded by the National Science Foundation, the Department of Defense, and the National Institutes of Health.

Title: CRTC’s efforts on Exascale-Era Grid Generation for Aerospace Industry and Real-Time Image-to-Mesh Conversion for Health Care Industry

Wednesday, 4/18/18 | 11 am - 12 pm | West Hall, W105

Speaker: Dr. Nikos Chrisochoides

Abstract: Computational science and, more recently, data science together form one of the three pillars complementing traditional theoretical studies and physical experiments in science and engineering. Parallel finite element mesh (grid) generation is one of the critical building blocks for this pillar. Parallel grid generation is a new research area transcending the boundaries of two disciplines: computational geometry and parallel computing. In this talk, we first examine how our efforts over the last 25 years are changing parallel grid generation, and then we look forward in terms of (i) trends in high performance computing and (ii) challenges in the aerospace industry that will influence basic research in this area for the next 15 years. In the second part, we focus on the health care industry, since today’s regulatory changes affect tomorrow’s disruption in health care. Specifically, we will briefly describe our work on image-to-mesh conversion and its applications in traumatic brain injuries, the design of stents for brain aneurysms, and the accurate fusion of 3D/4D medical images.

Bio: Nikos Chrisochoides is the Richard T. Cheng Distinguished Professor of Computer Science at ODU and a John Simon Guggenheim Fellow in Medicine & Health. He was elected a Distinguished Visiting Fellow of the Royal Academy of Engineering in the UK. His current research interests are in exascale/quantum and real-time medical image computing. Nikos received his Ph.D. in Computer Science from Purdue University in 1992. He worked at the Northeast Parallel Architectures Center in Syracuse and the Advanced Computing Research Institute at Cornell University. In 1997 he joined the Computer Science & Engineering Department at the University of Notre Dame, where he received his NSF CAREER Award. In 2000 he joined the College of William and Mary, where he was awarded the Alumni Memorial Professorship. He has held visiting positions at MIT, Harvard Medical School, and Brown University. As PI and Co-PI he has generated more than $14.5 million for his research group on high-performance and medical image computing, and he has more than 240 publications.

Title: Pervasive Sensing and Ubiquitous Water Management - How to prevent drowning … in data

Wednesday, 4/04/18 | 10-11 am | West Hall, W105

Speaker: Vladan Babovic

Abstract: We live in times characterized by pervasive sensing. Data are collected all the time, everywhere. In addition to fixed in situ sensors that record light, temperature, pollution, and other environmental factors, a proliferation of personal sensors such as Fitbit wristbands records the location, activity, and physiology of individuals. Smart phones and citizen scientist networks provide crowd-sourced sensing of the environment and infrastructure.

Widespread use of video cameras, lidars, and RFID technology already monitors the movement of pedestrians and vehicles as well as water levels in rivers, canals, and sewers. In urban environments, this large increase in the use of sensor networks for data collection has resulted in a shift towards the so-called Smart City paradigm. The literature claims that the implementation of data-rich, intelligent infrastructure will lead to vast improvements in the urban environment, ranging from decarbonisation to more efficient provision of government services and safer urban areas. Clearly, the data collected within Smart Cities may be extremely beneficial in informing local decision-makers. However, data itself does not constitute value; in fact, very little of the data is analyzed for purposes other than forensics or revenue generation. The true value of the data has to be derived through machine learning and other high-level processing of raw observations, and through fusion with the physical, biological, and informational sensing of the city. This talk describes a number of innovative approaches to harnessing big city data. Together, these applications form a new, versatile water management modelling instrument which supports not only operational water management but also decision-making processes and the dissemination of information to the public during disasters.

Bio: Vladan is a leading scientist in the field of hydroinformatics, where he has been spearheading data-driven research and computer modeling of hydraulic and hydrological phenomena since the early 1990s. In more recent years, his work on flexibility and real options pertaining to decision-making under deep uncertainty in water- and climate-related domains has been gaining wider recognition. In addition to being a leading researcher and educator, Vladan is a scientist-entrepreneur who was instrumental in securing funding for, and subsequently led the establishment and managed the growth of, two research institutes, the 65 million Singapore Delft Water Alliance (SDWA) and NUSDeltares, serving as founding Director of both. Under his leadership, SDWA and NUSDeltares were recognized in March 2014 with the prestigious Winsemius Awards. He is a Chartered Engineer and a Member of the Institution of Engineers (Singapore).

Title: Accurate Identification of Long Non-Coding RNAs Using a Neural Network

Friday, 3/30/18 | 12-1pm | West Hall, W105

Speaker: Dr. Gang Hu

Abstract: Long non-coding RNAs (lncRNAs) play important roles in a variety of biological processes. The rapid development of RNA-based transcriptome assembly and next-generation sequencing technology presents an opportunity to discover novel lncRNAs in a variety of currently sequenced species, especially species that lack gene annotations. We have developed an alignment-free, annotation-independent tool, Long Noncoding RNA Seeker (LNRSeeker), which accurately distinguishes between protein-coding RNAs and lncRNAs. LNRSeeker combines a neural network with a rich feature-based representation of transcripts. We have evaluated our method on seven transcriptomes, including human. The results show that LNRSeeker achieves MCC = 0.94 using 10-fold cross-validation on the human dataset, and an average MCC = 0.8 on the other six species when using the model built from the human dataset. Compared with state-of-the-art tools for lncRNA identification, such as CNCI, CPAT, CPC, and PLEK, our method achieves the highest MCC on human and on 5 of the 6 other species considered.
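
For context on the MCC values quoted above, the short sketch below (illustrative Python, not part of LNRSeeker) shows how the Matthews correlation coefficient is computed from a binary confusion matrix.

```python
import math

def mcc(tp, tn, fp, fn):
    """Matthews correlation coefficient: -1 (total disagreement), 0 (random), +1 (perfect)."""
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return 0.0 if denom == 0 else (tp * tn - fp * fn) / denom

# Toy example: lncRNAs as positives, protein-coding RNAs as negatives.
print(round(mcc(tp=90, fn=10, tn=95, fp=5), 3))
```

Unlike plain accuracy, MCC stays informative when the two classes are imbalanced, which is why it is a common choice for comparing classifiers of this kind.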

Bio: Gang Hu is an Associate Professor in the Department of Data Science and Informatics at Nankai University in China. He is also a visiting Associate Professor in the Department of Biostatistics, Epidemiology and Informatics at the University of Pennsylvania. His research focuses on bioinformatics; in particular, he is interested in computational structural biology and RNA-seq analysis. He received his Ph.D. in probability theory and mathematical statistics from Nankai University in 2005.

Title: Data-driven and Model-based Analysis for Intelligent Infrastructure: Two Case Studies

Friday, 2/23/18 | 12-1pm | West Hall, W105

Speaker: Dr. Liang Cheng

Abstract: In this talk, I will provide two case studies of data-driven and model-based approaches to enabling intelligent infrastructure. The first is how to analyze cyber-physical systems affected by random processes against deterministic performance requirements, using power substation automation systems as an example. The research was initiated from discussions with industry collaborators as they were helping their customers modernize substation automation systems for smart grid applications by adopting the international standard IEC (International Electrotechnical Commission) 61850. The second is how to realize temporally and spatially continuous underground sensing, a research project sponsored by the NSF (National Science Foundation). The research was developed with a vision of proactive maintenance for embankments to fight flooding, targeted instrumentation of pipelines to avoid leakage-induced disasters, and continuous checking of fuel tank surroundings to prevent serious pollution.

Bio: Prof. Liang Cheng of Lehigh University focuses his research on enabling intelligent infrastructure based on real-time sensing and model-driven data analytics through interdisciplinary projects, such as cyber security and smart grid (DOE and PITA projects), infrastructure monitoring (NSF and PITA projects), and IoT middleware (an NSF project seeded by his work on networked appliances in 2000). He is also an expert in ad hoc networks (NSF, DARPA, and PITA projects). His research has also been funded by industry partners such as ABB, Agere Systems, East Penn Manufacturing, and PPL. He has advised 6 Ph.D. students to graduation, supervised 2 postdocs, advised 22 Master’s theses, and co-authored more than 100 papers. He was a keynote speaker at the 2015 IEEE International Conference on the Edges of Innovation for Smarter Cities and is a founding member of the Lehigh INE (Integrated Networks for Electricity) Cluster.

Title: Software Reliability Engineering: Algorithms and Tools

Thursday, 2/1/18 | 11am-12pm | West Hall, W106

Speaker: Dr. Lance Fiondella

Abstract: While there are many software reliability models, there are relatively few tools to automatically apply these models. Moreover, these tools are over two decades old and are difficult or impossible to configure on modern operating systems, even with a virtual machine. To overcome this technology gap, we are developing an open source software reliability tool for the software engineering community. A key challenge posed by such a project is the stability of the underlying model fitting algorithms, which must ensure that the parameter estimates of a model are indeed those that best characterize the data. If such model fitting is not achieved, users who lack knowledge of the underlying mathematics may inadvertently use inaccurate predictions. This is potentially dangerous if the model underestimates important measures such as the number of faults remaining or overestimates the mean time to failure (MTTF). To improve the robustness of the model fitting process, we have developed expectation maximization (EM) and expectation conditional maximization (ECM) algorithms to compute the maximum likelihood estimates of nonhomogeneous Poisson process (NHPP) software reliability models. This talk will present an implicit ECM algorithm, which eliminates computationally intensive integration from the update rules of the ECM algorithm, thereby achieving a speedup of between 200 and 400 times that of explicit ECM algorithms. The enhanced performance and stability of these algorithms will ultimately benefit the software engineering communities that use the open source software reliability tool.
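
To make the model-fitting problem concrete, here is a generic Python sketch (not the implicit ECM algorithm from the talk) that fits the classic Goel-Okumoto NHPP model by direct numerical maximum likelihood; the toy failure times and parameter names are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Toy cumulative failure times (hours); real data would come from testing.
times = np.array([3.0, 8.0, 15.0, 25.0, 40.0, 60.0, 85.0, 120.0, 160.0, 210.0])
T = times[-1]

def neg_log_likelihood(params):
    """Negative log-likelihood of the Goel-Okumoto NHPP with
    mean value m(t) = a (1 - exp(-b t)) and intensity lambda(t) = a b exp(-b t)."""
    a, b = np.exp(params)                      # optimize in log-space so a, b > 0
    return -((np.log(a * b) - b * times).sum() - a * (1.0 - np.exp(-b * T)))

res = minimize(neg_log_likelihood, x0=[np.log(len(times)), np.log(0.01)],
               method="Nelder-Mead")
a_hat, b_hat = np.exp(res.x)
print(f"estimated total faults a = {a_hat:.1f}, detection rate b = {b_hat:.4f}")
print(f"expected faults remaining: {a_hat * np.exp(-b_hat * T):.1f}")
```

Direct optimization of this kind can be numerically fragile for some models and datasets, which is exactly the robustness gap that EM/ECM-style update rules aim to close.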

Bio: Dr. Lance Fiondella is an assistant professor in the Department of Electrical & Computer Engineering at the University of Massachusetts Dartmouth. He received his PhD (2012) in Computer Science & Engineering from the University of Connecticut. Dr. Fiondella has published over 100 peer-reviewed journal articles and conference papers. Eight of his conference papers have been recognized with awards, including four with his students. His research has been funded by the Department of Homeland Security, Army Research Laboratory, Naval Air Systems Command, National Aeronautics and Space Administration, and National Science Foundation, including a CAREER award. He has supervised four Master’s theses and is the doctoral advisor of four PhD students.

Title: Predictive Modeling of Drug Effects: Learning from Biomedical Knowledge and Clinical Records

Monday, 1/29/18 | 11am-12pm | West Hall, W105

Speaker: Dr. Ping Zhang

Abstract: Drug discovery is a time-consuming and laborious process. Lack of efficacy and safety issues are the two major reasons for which a drug fails clinical trials, each accounting for around 30% of failures. By leveraging the diversity of available molecular and clinical data, predictive modeling of drug effects could lead to a reduction in the attrition rate in drug development. In this talk, I will introduce my recent work on machine-learning techniques for analyzing and predicting clinical drug responses (i.e., efficacy and safety), including: 1) integrating multiple drug/disease similarity networks via joint matrix factorization to infer novel drug indications; and 2) revealing previously unknown effects of drugs, identified from electronic health records and drug information, on laboratory test results. Experimental results demonstrate the effectiveness of these methods and show that predictive models could serve as a useful tool to generate hypotheses on drug efficacy and safety profiles.
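
As a loose, self-contained illustration of the matrix-factorization idea (not Dr. Zhang's joint factorization method, which additionally integrates drug/disease similarity networks), the sketch below factorizes a made-up drug-disease association matrix with scikit-learn's NMF and ranks unobserved pairs as candidate indications.

```python
import numpy as np
from sklearn.decomposition import NMF

# Made-up binary drug x disease association matrix (1 = known indication).
R = np.array([[1, 0, 1, 0, 0],
              [1, 1, 0, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 1, 1],
              [1, 0, 0, 0, 1]], dtype=float)

# Low-rank factorization R ~ W @ H; reconstructed scores for the zero entries
# serve as ranked hypotheses for novel (drug, disease) indications.
model = NMF(n_components=2, init="nndsvda", max_iter=1000, random_state=0)
scores = model.fit_transform(R) @ model.components_

candidates = [(scores[i, j], i, j)
              for i in range(R.shape[0]) for j in range(R.shape[1]) if R[i, j] == 0]
for score, drug, disease in sorted(candidates, reverse=True)[:3]:
    print(f"drug {drug} -> disease {disease}: score {score:.2f}")
```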

Bio: Ping Zhang is a Research Staff Member at the Center for Computational Health, IBM T. J. Watson Research Center. His research focuses on machine learning, data mining, and their applications to biomedical informatics and computational medicine. He has published more than 35 peer-reviewed scientific articles in top journals and conferences (e.g., Nucleic Acids Research, BMC Bioinformatics, Journal of the American Medical Informatics Association, KDD, AAAI, ECML, SDM, and CIKM) and filed more than 15 patent applications. Dr. Zhang has served on the program committees of leading international conferences, including KDD, IJCAI, UAI, and AMIA, and on the editorial boards of CPT: Pharmacometrics & Systems Pharmacology and Journal of Healthcare Informatics Research. He won the best in-use/industrial paper award for ESWC 2016 and received a Marco Ramoni Distinguished Paper nomination at AMIA Summits 2014.

Title: Scalable Parallel Computing of Large-Scale Graph Analytics for Big Data

Friday, 1/26/18 | 11am-12pm | West Hall, W105

Speaker: Dr. Maleq Khan

Abstract: Graph analytics plays a critical role in the analysis of data from diverse sources such as the Internet, social networks, computational biology, scientific simulations, and finance. Graph-based analysis of data offers valuable insights and can lead to the discovery of hidden patterns in massive datasets. Traditional algorithms do not work well for big data. Moreover, such huge data may not fit in the memory of a single processing unit and thus require distributed systems where the data is distributed among multiple compute nodes. High performance computing platforms such as MPI, Spark, and Hadoop are now essential parts of big data analytics, as they provide frameworks for parallel and distributed computing with thousands of processing cores. In this talk, I will present some of my recent work on studying dynamics on networks using graph analytics and on developing scalable parallel algorithms for problems in large-scale graph analytics. I will begin with a study of using graph analytics to understand disease dynamics on social contact networks and how some properties of a network affect the dynamics on it. Then I will discuss some unique challenges in massive-scale graph analytics posed by big data and present some of my work on developing scalable distributed-memory parallel algorithms for generating random graphs, counting triangles, and subgraph analysis. These algorithms scale very well to a large number of cores (more than a thousand) and can efficiently handle graphs with billions of edges and vertices.
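
As a small, single-machine illustration of one kernel mentioned above (not Dr. Khan's distributed-memory algorithms), the sketch below counts triangles with the standard approach of intersecting neighbor sets; the graph and function name are made up.

```python
from collections import defaultdict

def count_triangles(edges):
    """Count triangles by intersecting neighbor sets.

    Enforcing the ordering u < v < w makes each triangle count exactly once;
    production codes typically order vertices by degree for better load balance.
    """
    neighbors = defaultdict(set)
    for u, v in edges:
        if u != v:
            neighbors[u].add(v)
            neighbors[v].add(u)

    triangles = 0
    for u in neighbors:
        for v in neighbors[u]:
            if u < v:
                triangles += sum(1 for w in neighbors[u] & neighbors[v] if w > v)
    return triangles

# Toy graph: a square with one diagonal contains exactly two triangles.
print(count_triangles([(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]))  # 2
```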

Bio: Maleq Khan is an Assistant Professor in the Department of Electrical Engineering and Computer Science at Texas A&M University–Kingsville. He received his Ph.D. in Computer Science from Purdue University in 2007 and then worked at the Biocomplexity Institute of Virginia Tech as a postdoc and as a research scientist before joining Texas A&M University–Kingsville in 2016. His research interests are big data analytics, high performance computing, parallel and distributed computing, data science, and data mining. His research work received a best paper award (DISC 2006) and a best paper award nomination (SC 2016). Additional details about Dr. Khan and his work can be found at www.maleqkhan.net.

Title: Programmable Logic Controller Forensics

Monday, 1/22/18 | 11am-12pm | West Hall, W105

Speaker: Dr. Irfan Ahmed

Abstract: Industrial control systems (ICS) are used to control and monitor our critical infrastructure, such as oil and gas pipelines, the power grid, and nuclear plants. Over the past several years, we have witnessed several cyberattacks on these physical industrial and infrastructure processes, including the Ukrainian power grid and Iranian nuclear plants. Digital forensic investigation is crucial to answering many questions about a cyberattack. Since ICS environments are significantly different from traditional IT systems, they pose serious limitations on current digital forensics tools and techniques. In this talk, Dr. Ahmed will discuss denial of engineering operations (DEO) attacks, which deceive the engineering software during attempts to retrieve the digital evidence (the control-logic program) remotely from a compromised programmable logic controller (PLC). To enable forensic investigation during DEO attacks, Dr. Ahmed will further present a decompiler, Laddis, which extracts a binary control-logic program from ICS network traffic and then decompiles it into a high-level, human-readable representation. Laddis has been evaluated on programs of varying complexity and demonstrates perfect reconstruction of the original control logic.

Bio: Dr. Irfan Ahmed is a Canizaro-Livingston Endowed Assistant Professor in Cybersecurity at the University of New Orleans and an Associate Director of the Greater New Orleans Center for Information Assurance (GNOCIA). His primary research interests include digital forensics and system security with emphasis on non-traditional IT environments such as industrial control systems and virtualized infrastructure. Dr. Ahmed is the recipient of two Best Paper Awards from well-known cybersecurity conferences and an Outstanding Research Award from the American Academy of Forensic Sciences. Recently, the University of New Orleans has awarded him the Early Career Research Prize to recognize his outstanding creative and scholarly activities.

Title: Natural Language Processing for the Privacy of Internet Users

Friday, 1/19/18 | 11 am-12 pm | West Hall, W105

Speaker: Dr. Shomir Wilson

Abstract: Although research shows that internet users care about their privacy, they do not have the ability to read and understand the privacy policies of all the websites they visit or all the apps they use. Fixing this gap in online notice and choice is the goal of the Usable Privacy Policy Project, an NSF-funded project to extract salient details from privacy policies and present them to internet users in ways that are responsive to their needs. I will present my ongoing work as the lead for the project's natural language processing and crowdsourcing efforts. Our results show that crowdworkers can answer questions about privacy policies with high accuracy and automated methods can identify important details in policy texts, such as statements about data collection and users' privacy options. I will then present some vignettes from my research on online social network privacy and entity linking, along with a long-term goal of "user-oriented natural language processing" to break down the most complex texts that people are obligated to read and automatically find the details that affect them the most.
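
As a toy sketch of the kind of policy-text classification described above (this is not the project's code; the sentences and labels are invented and far too few to be meaningful, but they show the shape of a pipeline), the example below trains a classifier to flag data-collection statements.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Made-up policy sentences labeled 1 = "data collection statement", 0 = other.
sentences = [
    "We collect your email address and device identifiers.",
    "We may share aggregated usage statistics with partners.",
    "Our service collects location data to personalize content.",
    "You can contact support at any time.",
    "This policy was last updated in January.",
    "We log the pages you visit and your IP address.",
    "The company is headquartered in Virginia.",
    "Cookies are stored to collect browsing history.",
]
labels = [1, 0, 1, 0, 0, 1, 0, 1]

# TF-IDF features feeding a linear classifier; real systems train on
# thousands of annotated policy segments rather than eight toy sentences.
clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(sentences, labels)

print(clf.predict(["We gather your contact information when you register.",
                   "Please review this agreement carefully."]))
```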

Bio: Shomir Wilson is an Assistant Professor in the Department of Electrical Engineering and Computer Science at the University of Cincinnati, where he leads the Human Language Technologies Lab and is a member of the Institute for Analytics Innovation. His interests span pure and applied research in natural language processing, privacy, and data science. Previously he held postdoctoral and lecturer positions in Carnegie Mellon University's School of Computer Science, and he spent a year as an NSF International Research Fellow at the University of Edinburgh's School of Informatics. He received his Ph.D. in Computer Science from the University of Maryland in 2011.

Fall 2017

Title: Towards Real-Time Deformable Registration for Image Guided Neurosurgery

Friday, 12/1/17 | 12:00-1:00pm | West Hall, W105

Speaker: Nikos Chrisochoides
CRTC Lab, Computer Science Dept.
Old Dominion University

Abstract: In image-guided neurosurgery, co-registered preoperative anatomical, functional, and diffusion tensor imaging can be used to facilitate a maximally safe resection of brain tumors in eloquent areas of the brain. However, because the brain can deform significantly during surgery, particularly in the presence of tumor resection, non-rigid registration of the preoperative image data to the patient is required. In this talk I shall present: (i) the evolution of an Adaptive Physics-Based Non-Rigid Registration (APBNRR) method which registers preoperative and intraoperative brain MRI and (ii) results from a comparison with three other readily available and widely used registration methods on 30 glioma cases performed at two different hospitals in the US and China. APBNRR improves the accuracy of deformable registration by more than a factor of five compared to rigid and existing physics-based non-rigid registration methods, and it reduces the end-to-end execution time to within the time constraints imposed by the neurosurgical procedure.

Bio: Nikos Chrisochoides is the Richard T. Cheng Distinguished Professor of Computer Science and a John Simon Guggenheim Fellow in Medicine & Health. He was elected a Distinguished Visiting Fellow of the Royal Academy of Engineering in the UK. His current research interests are in exascale/quantum and real-time medical image computing. Nikos received his Ph.D. in Computer Science from Purdue University in 1992. He worked at the Northeast Parallel Architectures Center in Syracuse and the Advanced Computing Research Institute at Cornell. In 1997 he joined the Computer Science & Engineering Department at Notre Dame, where he received his NSF CAREER Award. In 2000 he joined the College of William and Mary, where he was awarded the Alumni Memorial Professorship. He has held visiting positions at MIT, Harvard Medical School, and Brown University. He has received more than $14.5 million for his research on high-performance and medical image computing and has more than 230 publications.

Title: Exploring the Frontiers of Blockchain Design

Friday, 11/10/17 | 12:00-1:00pm | West Hall, W105

Speaker: Dr. Hongsheng Zhou

Bio: Hong-Sheng Zhou is an Assistant Professor in the Computer Science Department at Virginia Commonwealth University (VCU). He was a postdoc at the Maryland Cybersecurity Center, University of Maryland, under the direction of Jonathan Katz. Before that, he received his PhD from the University of Connecticut with Aggelos Kiayias as advisor. Hong-Sheng is currently working on multiple directions in cryptography, including Cryptocurrency and Blockchain Technologies, Post-Snowden Cryptography, and Secure Computing, and he publishes regularly in top-tier cryptography, cybersecurity, and distributed computing conferences, including CRYPTO, Eurocrypt, CCS, and PODC. More information about his research can be found on his group page: http://cryptographylab.bitbucket.io

Abstract: Cryptocurrencies like Bitcoin have proven to be a phenomenal success. The underlying techniques hold huge promise to change the future of financial transactions, and eventually our ways of computation and collaboration. At the heart of the Bitcoin system is a blockchain that records transactions between users in consecutive time windows. The blockchain is maintained in an open network environment, i.e., the Internet, by a peer-to-peer network of nodes called Bitcoin miners via the so-called proof-of-work mechanism. In this talk, I will first review Bitcoin's proof-of-work based blockchain protocol and present the main challenges and dream goals in the design space. Then we will together explore the frontiers of blockchain design, including how to design blockchains via alternative mechanisms (e.g., using proof-of-stake, proof-of-space, or trusted hardware) and how to extend blockchains with new features.
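
To make the proof-of-work mechanism concrete, here is a deliberately simplified toy sketch in Python (the block format and difficulty are assumptions, and real Bitcoin mining differs in many details): a miner searches for a nonce whose hash falls below a difficulty target.

```python
import hashlib

def mine(block_data: str, difficulty_bits: int) -> int:
    """Return a nonce such that SHA-256(block_data || nonce) starts with
    `difficulty_bits` zero bits; expected work doubles with each extra bit."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while int.from_bytes(hashlib.sha256(f"{block_data}{nonce}".encode()).digest(),
                         "big") >= target:
        nonce += 1
    return nonce

block = "prev_hash=000abc;txs=alice->bob:5;height=42"
nonce = mine(block, difficulty_bits=20)
print("nonce:", nonce)
print("hash :", hashlib.sha256(f"{block}{nonce}".encode()).hexdigest())
```

Verifying a solution takes a single hash while finding one takes many; that asymmetry is what lets an open, permissionless network agree on the next block.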

Title: Modelling and optimizing failure-aware topologies of transport networks

Friday, 11/03/17 | 12-1pm | West Hall, W105

Speaker: Dr. Roza Goscien

Abstract: Recently, telecommunication networks have become an indispensable part of everyday life. They support a variety of human activities, including business, entertainment, and social life; therefore, they have to continuously evolve and develop to meet users' increasing requirements and expectations. Some of the most important trends in the development of current transport networks are cloud- and content-readiness as well as survivable service provisioning. The talk outlines possible solutions, related both to network infrastructure and to traffic engineering tools, for modeling and optimizing efficient, attack- and failure-aware topologies of transport networks. General approaches are presented, along with their application to a real case study.

Bio: Roza Goscien is an Assistant Professor in the Department of Systems and Computer Networks, Wroclaw University of Science and Technology (Wroclaw, Poland). She received her PhD with distinction in computer science from the same university in 2016, and her PhD thesis was recognized as a best thesis of 2016 by the Polish Prime Minister. Her research interests focus mainly on the modeling and optimization of survivable communication networks. She is currently involved in several research projects related to the modeling and optimization of efficient and survivable optical networks, including a project funded by the Polish National Army. She was awarded the 2014 Fabio Neri Best Paper Award from the Optical Switching and Networking journal and the Best Paper Award at the RNDM 2015 conference.

Title: Evaluating machine learning algorithms part 1: metrics and data sampling

Friday, 10/13/17 | 12-1pm | West Hall, W105

Speaker: Dr. Bartosz Krawczyk

Abstract: Machine learning algorithms are among the most popular tools for contemporary data mining and data science. Due to the increasing accessibility of off-the-shelf solutions and software packages with user-friendly interfaces, they have become widely used not only by domain experts, but also by everyday users looking for efficient problem-solving tools. However, when using such learning algorithms one must understand what one is actually doing. Proper evaluation is the key to conducting data mining correctly. However, evaluation standards differ from person to person and discipline to discipline. How do we decide which standards are right for us? Additionally, evaluation gives us a lot of, sometimes contradictory, information. How do we make sense of it all? This talk will give an overview of existing metrics for evaluating machine learning methods and provide guidelines for selecting the proper ones for the task at hand. Data sampling will be discussed in detail, showing how to properly select training, validation, and testing instances. This talk is the first part of a series of lectures on how to design and conduct proper machine learning experiments.
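
As a generic companion to the topics above (this is not the lecture material; the dataset is synthetic and scikit-learn is assumed to be available), the sketch below compares several metrics under stratified 10-fold cross-validation on an imbalanced problem, where accuracy alone can be misleading.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_validate

# Synthetic, imbalanced binary problem (roughly 90% negatives).
X, y = make_classification(n_samples=2000, n_features=20, weights=[0.9, 0.1],
                           random_state=0)

# Stratified folds preserve the class ratio in every training/testing split.
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
scores = cross_validate(LogisticRegression(max_iter=1000), X, y, cv=cv,
                        scoring=["accuracy", "balanced_accuracy", "f1", "roc_auc"])

for name in ["accuracy", "balanced_accuracy", "f1", "roc_auc"]:
    vals = scores[f"test_{name}"]
    print(f"{name:>17}: {vals.mean():.3f} +/- {vals.std():.3f}")
```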

Bio: Bartosz Krawczyk is an assistant professor in the Department of Computer Science, Virginia Commonwealth University, Richmond, VA, USA, where he heads the Machine Learning and Stream Mining Lab. He obtained his MSc and PhD degrees from Wroclaw University of Science and Technology, Wroclaw, Poland, in 2012 and 2015, respectively. His research focuses on machine learning, data streams, ensemble learning, class imbalance, one-class classifiers, and interdisciplinary applications of these methods. He has authored 40+ international journal papers and 90+ contributions to conferences. Dr. Krawczyk has received numerous prestigious awards for his scientific achievements, including the IEEE Richard Merwin Scholarship, the IEEE Outstanding Leadership Award, and the Best PhD Thesis Award from the Polish Association for Artificial Intelligence, among others. He has served as a Guest Editor for four journal special issues and as a chair of twelve special sessions and workshops. He is a Program Committee member for over 50 international conferences and a reviewer for 30 journals.

Title: My 45 years in Computational Science: Three High Performance Computing Examples

Friday, 10/6/17 | 12:00-1:00pm | West Hall, W105

Speaker: Tarynn M Witten, PhD, FGSA
Professor and Director of R&D, Center for the Study of Biological Complexity, Virginia Commonwealth University

Abstract: Computer Science can be divided into two parts (for the purposes of this presentation): Computing Science and Computational Science (also known as Scientific Computing). Moreover, within the field of Scientific Computing there exists a research subset called Computational Biosciences and Medicine. In this presentation, I will discuss three high-performance computational science projects that I carried out over my career: (1) simulation and visualization of a slice of cat brain, (2) estimation of the variation of parametric estimates in Gompertz survival model applications, and (3) a currently ongoing project involving exhaustive combinatorial enumeration of mortality subsets of worldwide populations. We will see how none of these projects could have been carried out without the use of high performance computing, and how computational science enhances the larger domain of what we currently call computer science.

Bio: Dr. Tarynn M. Witten earned her Ph.D. in Theoretical Biology and Biophysics at the Center for Theoretical Biology, SUNY Buffalo. She is currently a Professor of Biological Complexity at the Center for the Study of Biological Complexity, VCU. Dr. Witten is the holder of the Inaugural Nathan W. and Margaret Shock New Investigator Research Award for her work on the use of supercomputing algorithms to study the effects of sample size estimation on mortality patterns in different animal species. She authored the first Encyclopedia of Computer Science entry on Computational Biology and Medicine, has co-authored/edited over 10 books on computational biomedicine and has served as a consultant in supercomputing in biomedicine for such companies as ETA Systems, Control Data Corporation, Cray Computing, and Thinking Machines. Dr. Witten is the former Director of Applications Research and Development as well as the Associate Director of the University of Texas System Center for High Performance Computing. In 1994 she organized and chaired the First World Congress on Computational Medicine, Public Health, and Biotechnology and in 2005 she was awarded the Apple Computer Distinguished Educator of the Year Award for her use of high performance computing in teaching at the high school level.

Title: Basic Understanding of Patent Law Concepts

Friday, 9/29/17 | 12:00-1:00pm | West Hall, W105

Speaker: John Lyon

Bio: John Lyon is a senior associate at Thomas Horstemeyer, LLP, an intellectual property law firm headquartered in Atlanta. John graduated from Georgia Tech with a degree in Computer Science and attended law school at Georgia State University. John’s practice areas include patents, trademarks, and open source license compliance. If you ask John how he got to where he is today, his answer will be “the scenic route.”

Synopsis: This presentation discusses the patentability of software and software related inventions. The presentation aims to educate the audience regarding the types of software related inventions that are eligible for patent protection using a series of case studies of issued software patents. Key takeaways of this presentation will include a basic understanding of patent law concepts such as “abstractness,” “novelty,” and “obviousness.”

Title: How-To Start a Career in Cybersecurity

Friday, 9/22/17 | 12:00-1:00pm | West Hall, W105

Speaker: Brian Erdelyi

Bio: Brian Erdelyi studied Civil Engineering at Carleton University in Ottawa, Canada. In 1994 he left Carleton University and began working at IBM as an Advisory System Engineer, where he was involved in the support and development of first-generation Internet commerce and web content filtering software. For the past 20+ years, Brian has worked exclusively in cybersecurity, as Information Security Officer, Cybersecurity Manager, and Director of Global IS Risk Management for financial services and technology companies such as EDULINX Canada, Blackmont Capital, Capgemini, Manulife Financial, Clarien Bank, and NTT Data. Brian developed the Information Security curriculum for Bermuda College and has built global cybersecurity teams. Brian has lived and worked as a cybersecurity thought leader throughout Canada, the US, Saudi Arabia, and Bermuda. Most recently, he has enjoyed the flexibility and adventure of working as a cybersecurity consultant, architecting security solutions and security programs for leading companies such as TD Bank, BMO, Morgan Stanley, and Altria.

Abstract: Cybersecurity attacks and data breaches are growing in frequency and sophistication. Demand for cybersecurity professionals continues to outpace the supply. In 2014 Cisco warned that there was a global shortage of 1 million cybersecurity professionals, and by 2021 it is predicted that there will be 3.5 million unfilled cybersecurity jobs worldwide. Although cybersecurity spans all areas of IT, it offers its own unique career opportunities. Join this session to learn more about the different roles within security, what skills are in demand, how you can plan to develop the skills that companies need, and tips to help you get noticed as a cybersecurity professional. There will also be opportunities to ask questions to help you navigate the field of cybersecurity.

Check back often for more seminars.

Previous seminars