Tag: 2023
All the talks with the tag "2023".
Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks
Sagar Prakash Barad · Published at 07:30 PM
In this talk, we discuss Faster R-CNN, a unified, end-to-end object detection network that significantly advanced object detection by introducing the Region Proposal Network (RPN). The RPN generates region proposals, candidate bounding boxes likely to contain objects of interest, which the detection head then classifies and refines.
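To ground the idea, here is a minimal PyTorch sketch of an RPN head. The 512-channel feature map and nine anchors per location echo the paper's VGG-16 setup, but the snippet is an illustration, not the authors' implementation; anchor generation, NMS, and the detection head are omitted.

import torch
import torch.nn as nn

class RPNHead(nn.Module):
    def __init__(self, in_channels=512, num_anchors=9):
        super().__init__()
        # Shared 3x3 conv slid over the backbone feature map.
        self.conv = nn.Conv2d(in_channels, in_channels, 3, padding=1)
        # Per-anchor objectness score (object vs. background).
        self.cls = nn.Conv2d(in_channels, num_anchors, 1)
        # Per-anchor box deltas (dx, dy, dw, dh) that refine the anchor.
        self.reg = nn.Conv2d(in_channels, num_anchors * 4, 1)

    def forward(self, feats):
        h = torch.relu(self.conv(feats))
        return self.cls(h), self.reg(h)

feats = torch.randn(1, 512, 38, 50)   # e.g., VGG-16 conv5 features
scores, deltas = RPNHead()(feats)
print(scores.shape, deltas.shape)     # (1, 9, 38, 50), (1, 36, 38, 50)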
3D Gaussian Splatting for Real-Time Radiance Field Rendering
Annada Prasad Behera · Published at 07:30 PM
In this talk, we present a novel method for real-time radiance field rendering using 3D Gaussian splatting. Our approach achieves state-of-the-art visual quality while maintaining competitive training times and allows high-quality real-time novel-view synthesis at 1080p resolution. We introduce three key elements that enable this: 3D Gaussians for scene representation, interleaved optimization and density control, and a fast visibility-aware rendering algorithm. We demonstrate the effectiveness of our method on several established datasets.
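A small numpy sketch to make the splatting idea concrete: front-to-back alpha blending of already-projected Gaussians at a single pixel. The projection of 3D means and covariances to 2D, and the paper's tile-based CUDA rasterizer, are omitted, so treat this as an illustration of the blending step only.

import numpy as np

def splat_pixel(pixel, means2d, covs2d, colors, opacities, depths):
    """Alpha-blend projected Gaussians front-to-back at one pixel."""
    order = np.argsort(depths)                # nearest Gaussian first
    color, transmittance = np.zeros(3), 1.0
    for i in order:
        d = pixel - means2d[i]
        # Opacity modulated by the 2D Gaussian falloff at this pixel.
        alpha = opacities[i] * np.exp(-0.5 * d @ np.linalg.inv(covs2d[i]) @ d)
        color += transmittance * alpha * colors[i]
        transmittance *= 1.0 - alpha
        if transmittance < 1e-4:              # early termination
            break
    return color

# Two toy Gaussians: a red one behind a blue one.
means2d = np.array([[10.0, 10.0], [11.0, 10.0]])
covs2d = np.stack([4.0 * np.eye(2), 9.0 * np.eye(2)])
colors = np.array([[1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
print(splat_pixel(np.array([10.5, 10.0]), means2d, covs2d, colors,
                  np.array([0.8, 0.6]), np.array([2.0, 1.0])))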
Formal Mathematics Statement Curriculum Learning
Rahul Vishwakarma · Published at 02:00 PM
In this talk, we explore the use of expert iteration in the context of language modeling applied to formal mathematics. We show that at the same compute budget, expert iteration, in which proof search is interleaved with learning, dramatically outperforms proof search alone. We also observe that when applied to a collection of formal statements of sufficiently varied difficulty, expert iteration finds and solves a curriculum of increasingly difficult problems, without the need for associated ground-truth proofs.
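A hedged sketch of the expert-iteration loop follows. Here sample_proof, verify, and finetune are hypothetical callables standing in for the proof-search policy, the formal proof checker, and the fine-tuning step; none of them are specified by this summary, and the loop structure is an illustration of the general technique rather than the paper's exact procedure.

def expert_iteration(model, statements, sample_proof, verify, finetune,
                     rounds=8, attempts=32):
    """One expert-iteration run: search, filter by the checker, learn."""
    for _ in range(rounds):
        solved = []
        for stmt in statements:
            for _ in range(attempts):
                proof = sample_proof(model, stmt)   # proof search step
                if verify(stmt, proof):             # formal checker accepts
                    solved.append((stmt, proof))
                    break
        # Learn only from self-found, verified proofs; with statements of
        # varied difficulty, harder ones become solvable in later rounds,
        # yielding an implicit curriculum.
        model = finetune(model, solved)
    return model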
Implicit Neural Representations with Periodic Activation Functions
Annada Prasad Behera · Published at 02:00 PM
Implicitly defined, continuous, differentiable signal representations parameterized by neural networks offer many benefits over conventional representations. We propose leveraging periodic activation functions for implicit neural representations, dubbed sinusoidal representation networks (SIRENs). SIRENs are ideally suited for representing complex natural signals and their derivatives, and can solve challenging boundary value problems.
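The building block is compact enough to sketch. The PyTorch layer below follows the paper's sine activation and layer-wise uniform initialization, with omega_0 = 30 as the default frequency scale; the small network at the end is just an illustrative coordinate-to-signal mapping.

import math
import torch
import torch.nn as nn

class SineLayer(nn.Module):
    def __init__(self, in_f, out_f, omega_0=30.0, is_first=False):
        super().__init__()
        self.omega_0 = omega_0
        self.linear = nn.Linear(in_f, out_f)
        with torch.no_grad():
            if is_first:
                bound = 1.0 / in_f                       # first-layer init
            else:
                bound = math.sqrt(6.0 / in_f) / omega_0  # hidden-layer init
            self.linear.weight.uniform_(-bound, bound)

    def forward(self, x):
        # Periodic activation: sin(omega_0 * (Wx + b)).
        return torch.sin(self.omega_0 * self.linear(x))

# Map 2D coordinates to a scalar signal value, e.g., image intensity.
siren = nn.Sequential(SineLayer(2, 256, is_first=True),
                      SineLayer(256, 256), nn.Linear(256, 1))
print(siren(torch.rand(8, 2)).shape)   # torch.Size([8, 1])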
Are Transformers Effective for Time Series Forecasting?
Jyotirmaya Shivottam · Published at 07:30 PM
This talk discusses a recent paper that compares the effectiveness of transformer-based architectures with simple linear neural models for long-term time series forecasting tasks. The paper concludes that the linear models outperform transformers on these tasks and offers a hypothesis for why. We will also briefly discuss some recent papers that use transformers for time series forecasting and end with a discussion of a literature gap in this domain.
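To see how simple the winning baseline is, here is a sketch in the spirit of the paper's plain Linear model: one affine map from the lookback window to the forecast horizon, applied independently per channel. The sequence lengths are illustrative, not the paper's benchmark settings.

import torch
import torch.nn as nn

class LTSFLinear(nn.Module):
    def __init__(self, seq_len=336, pred_len=96):
        super().__init__()
        # Single linear map over time, shared across all channels.
        self.proj = nn.Linear(seq_len, pred_len)

    def forward(self, x):                 # x: (batch, seq_len, channels)
        return self.proj(x.transpose(1, 2)).transpose(1, 2)

x = torch.randn(4, 336, 7)                # e.g., a 7-variate input window
print(LTSFLinear()(x).shape)              # torch.Size([4, 96, 7])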
Exploring Long-term (Time-)Series Forecasting (LTSF) using Echo State Networks (ESNs) and comparisons with Single-Layer Perceptron (SLP), MLP, LSTM and especially Attention-based methods
Jyotirmaya Shivottam · Published at 12:14 AM
This talk will explore Echo State Networks (ESNs) and their applications in Long-term (Time-)Series Forecasting (LTSF). We will compare ESNs with Single-Layer Perceptron (SLP), Multi-Layer Perceptron (MLP), Long Short-Term Memory (LSTM) networks, and especially attention-based methods for LTSF.
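As background for the comparison, here is a minimal numpy ESN: a fixed random reservoir with tanh dynamics rescaled to a sub-unit spectral radius, plus a ridge-regression readout, which is the only trained part. The hyperparameters and the toy sine series are illustrative assumptions, not settings from the talk.

import numpy as np

rng = np.random.default_rng(0)
n_res, spectral_radius, ridge = 200, 0.9, 1e-6
W_in = rng.uniform(-0.5, 0.5, (n_res, 1))          # fixed input weights
W = rng.uniform(-0.5, 0.5, (n_res, n_res))         # fixed reservoir weights
W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))

u = np.sin(np.linspace(0, 40 * np.pi, 2000))[:, None]   # toy scalar series
states = np.zeros((len(u), n_res))
x = np.zeros(n_res)
for t in range(len(u) - 1):
    x = np.tanh(W_in @ u[t] + W @ x)               # reservoir update
    states[t + 1] = x

# Only the linear readout is trained, via ridge regression, to predict
# the next value from the current reservoir state.
X, y = states[100:], u[100:]                       # drop warm-up transient
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
print(np.mean((X @ W_out - y) ** 2))               # readout training MSE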